Embeddable web view in apps
by Richard Gale
It seems like NetSurf would be a great candidate for providing
HTML-based UI in apps. Is this something anyone has tried, from a
technical point of view?
Does the current licensing support this for paid-for apps, or is
NetSurf licensable for such purposes?
Thanks, Richard.
IDNA2008 - take 2
by Chris Young
OK, second attempt at international domain name support.
Branch: chris/idna2008
I've had to import some unrestricted code from elsewhere, due to the
need for Unicode normalisation and other things. It is working and,
as far as I can tell, conforms to the spec.
A couple of minor issues/todos:
1. If an invalid URL is encountered during page layout/box conversion,
NetSurf gives a BoxConvert warning and the page is never displayed.
This is caused by my new code making nsurl_create return
NSERROR_BAD_URL when an IDN fails the compliance checks.
I've not been able to work out where in the core this error code is
terminating page layout.
Page showing this problem:
http://blogs.msdn.com/b/shawnste/archive/2006/09/14/idn-test-urls.aspx
2. If a frontend wants to display the UTF-8 version of an IDN, the
URL currently has to be split into its component parts, the host run
through idna_decode(), and the whole thing put back together again
(see the sketch after this list). This should probably be handled by
nsurl, but I'm not sure of the best way to implement it.
3. There are some to-dos noted in code comments for further compliance
checking. They are optional in the spec, and I don't see any need to
implement them - anything invalid will be rejected by DNS. Most of
the mandatory checks seem overkill anyway, given that there is
stricter checking at DNS registration time.
I have included the optional decode-reencode check for
already-encoded addresses, to weed out any undecodeable nonsense the
user might have typed in, but it doesn't bother to do normalisation
or validity checking of the decoded address before re-encoding it
(maybe it should; I'm not sure, as the spec is vague on this point).
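To make item 2 concrete, here is roughly what a frontend has to do at
the moment. This is only a sketch: idna_decode() is from my branch
and I'm assuming its signature here, and I'm using the nsurl and
libwapcaplet accessors from memory, so the details may differ.

    /* Sketch only: obtain a UTF-8 display form of an IDN host.
     * idna_decode() is the branch code; its signature is assumed. */
    #include <stdlib.h>

    #include <libwapcaplet/libwapcaplet.h>

    #include "utils/errors.h"
    #include "utils/idna.h"  /* assumed header for the branch code */
    #include "utils/nsurl.h"

    static char *idn_display_host(const nsurl *url)
    {
            lwc_string *host;
            char *uhost = NULL;
            size_t uhost_len;

            host = nsurl_get_component(url, NSURL_HOST);
            if (host == NULL)
                    return NULL;

            /* Decode the (possibly punycode) host to UTF-8 */
            if (idna_decode(lwc_string_data(host),
                            lwc_string_length(host),
                            &uhost, &uhost_len) != NSERROR_OK)
                    uhost = NULL;

            lwc_string_unref(host);

            /* The caller still has to splice uhost back in between
             * the scheme/userinfo and the port/path to rebuild a
             * displayable URL. */
            return uhost;
    }

If nsurl grew something like an nsurl_get_utf8() (name entirely
hypothetical), all of this would collapse into one call.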
Chris
HTTP 2.0
by Chris Young
I've just put a small patch in branch chris/http2 to enable libcurl
to use HTTP 2.0, provided libcurl is a recent enough version.
libcurl also needs building against nghttp2 for this to do anything
(there is an equivalent patch in the toolchains chris/http2 branch)
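For illustration, the guts of it amount to something like this (a
minimal sketch of the relevant libcurl usage, not the actual patch;
the URL is just an example):

    /* Minimal sketch: ask libcurl to attempt HTTP/2.0.  This does
     * nothing useful unless libcurl itself was built against
     * nghttp2. */
    #include <curl/curl.h>

    int main(void)
    {
            CURL *curl;
            CURLcode res = CURLE_FAILED_INIT;

            curl_global_init(CURL_GLOBAL_DEFAULT);
            curl = curl_easy_init();
            if (curl != NULL) {
                    curl_easy_setopt(curl, CURLOPT_URL,
                                    "https://www.google.com/");
                    /* Request HTTP/2.0; libcurl falls back to
                     * HTTP/1.1 if it cannot be negotiated. */
                    curl_easy_setopt(curl, CURLOPT_HTTP_VERSION,
                                    CURL_HTTP_VERSION_2_0);
                    res = curl_easy_perform(curl);
                    curl_easy_cleanup(curl);
            }
            curl_global_cleanup();
            return (res == CURLE_OK) ? 0 : 1;
    }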
As HTTP 2.0 is still in draft, I've put an option in to enable it
rather than having it on by default. Google's server returns an error
when not connecting over https; over https it works fine (and appears
to have negotiated http2 to some extent), so I think it's probably
just not up to date with the latest draft.
I'll temporarily add myself to the curl mailing list and ask about
that.
Anyway, it's there for testing, merging, criticising, whatever.
Chris
NetSurf 3.1 AmigaOS build
by Chris Young
Is here: http://www.os4depot.net/share/network/browser/netsurf.lha
I was under the impression that Jenkins was going to build this, but
as it didn't seem forthcoming I built it manually.
I'll replace it with the Jenkins build if we get one later, otherwise
the above archive can be taken for the NetSurf website.
If v3.0 is needed to complete the set (as it was never on the NetSurf
website), I've kept hold of a copy of that old archive.
Chris
Perf and samples
by Elie Roudninski
Hi everyone,
I tried libhubbub to parse HTML in SAX mode, but I noticed some
performance problems with big files: a file that libxml parses in
about 1 second takes libhubbub more than 10 seconds.
According to kcachegrind, 83% of the time is spent in memmove; the
call graph is attached.
As I understand the problem, libparserutils keeps refilling its
internal UTF-8 buffer over and over again.
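To illustrate what I think is happening (this is just the general
shape of the problem, not libparserutils's actual code):

    #include <string.h>

    struct inputbuf {
            char *data;
            size_t length;  /* bytes currently held */
    };

    /* Discard the first 'used' bytes by shifting the rest to the
     * front.  If a parser consumes a few bytes at a time from a
     * large buffer, repeatedly memmove'ing the (still large) tail
     * makes the total work quadratic in the input size. */
    static void inputbuf_discard(struct inputbuf *buf, size_t used)
    {
            memmove(buf->data, buf->data + used, buf->length - used);
            buf->length -= used;
    }

Keeping a read offset into the buffer, and only compacting when the
buffer has to grow, would avoid most of the shifting.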
Performance is very important to me, so I have started my own SAX
parser: https://github.com/marmeladema/Saxxy (comments are
appreciated). I would also like to know: do you have some kind of
test file database for validating libhubbub, so that I can test my
own parser against it?
Thanks in advance
adema
NetSurf 3.1 last call
by Vincent Sanders
Last call for the 3.1 release. If you have any fixes or updates that
ought to go into this release, now is the time.
I will be tagging the release and generating release tarballs for the
libraries and NetSurf over the weekend of 12th and 13th April.
The CI system will be used to build the release binaries, which will
be tested during the following week.
If there are critical problems with the proposed release, it will be
redone the following weekend (19th and 20th April).
The general release to the public will be made on 26th April.
--
Regards Vincent
http://www.kyllikki.org/
Static analysis with cppcheck
by Michael Drake
We've added cppcheck to the CI server.
You can get to its output from
http://ci.netsurf-browser.org/jenkins/job/cppcheck-netsurf/
Unlike the scan-build static analysis tool, this one analyses
everything, so it includes all the platform-specific code (amiga,
atari, riscos, etc.).
The scan-build results are also available on the CI server but only
cover core, framebuffer front end, and our library code.
Hopefully the cppcheck results will be useful. I've already fixed a
few issues it found in riscos/. It hasn't found much in the core, but
that code is already well checked by Coverity and scan-build.
Cheers,
Michael