-
• #27
We do have very harsh sanitisation and validation on LFGSS. It's also a text-based internet forum. We do not need canvas or most of the other new tags.
Your statement is true only if you think that what is currently LFGSS is all it need be. I happen not to believe that, and I do want the newer features when they are available and will be using them when they're widely adopted. You say there is no need for canvas but a skid patch calculator would be much nicer if we had canvas, as would a polo knockout tournament graph or a bike size diagram. Just because we don't have those things doesn't mean you shouldn't dream of having them.
My choice: drag heels in favour of existing support and be constrained by the past or push at the edges to help drive support and allow myself to dream of the future.
I'm choosing the latter.
Besides, most of HTML5 is a small-step improvement over HTML4. The tags make sense, and existing browsers won't treat things like footer as tag soup if you use CSS to declare footer as a block element with the desired spacing and font.
Using most of the commonly agreed upon (and unlikely to change) parts of HTML5 will result in semantic, cleaner code. That is the point... less code, clearer code, and ultimately code that is parsed and rendered faster.
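The footer-as-block idea above can be sketched in a few lines of CSS (the element list here is illustrative, not a complete one):

```css
/* Declare the new HTML5 structural elements as ordinary block boxes,
   so browsers that don't yet know them still lay them out correctly. */
header, footer, nav, section, article, aside {
  display: block;
}
```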
-
• #28
What are your plans with IE?
-
• #29
Sorry, but XHTML 2 is a working draft, like HTML 5. It departs from the current HTML standard and utilises XForms, XLink, etc.
http://www.w3.org/TR/xhtml2/introduction.html
It's great that you take pride in your work etc ... and I mostly agree with what you're saying in terms of pride in work and so on. Just that often reality catches up, and somehow people inject invalid code.
I'd love to always have a work environment where I'm not the only person caring about standards, security, automatic testing etc.
Cassandra syndrome I guess.
Maybe it's because I am contracting that I have become more and more sarcastic about things. I had projects where I had deployment tools set up to check every single page and validate it.
Six months later I would come back to fix something on that very project, and someone had commented out the validation step, which in turn messed up all the templates with inline JavaScript that was shite in the first place and wouldn't pass validation in the second.

I think writing a site in semantically correct, high-standard HTML 4 and at a later stage changing the doctype to HTML5, revalidating and fixing minor issues is a realistic scenario for a website.
It also would be future proof. Why mention future proofing if XHTML 1.0 is where you leave it? That is just current-state proof.

It makes sense, however, that you advocate XHTML 1.0, because you use XML tools.
BTW: is it possible to generate the <!doctype html> thingy with MSXML and the xsl:output directive? With xsltproc it's just not possible.

In regards to the different rendering modes, HTML 4 strict with the correct doctype will produce equally good results IMO. These things, as you mentioned, were mapped out a while ago:
http://hsivonen.iki.fi/doctype/
Note that the HTML5 doctype was designed around what browsers already do: it triggers standards mode regardless of whether HTML5 was known at the time the browser was conceived.
Sorry, I don't mean to be a pain in the arse, and it's good to talk with people who know what they're on about, but we might have to agree to disagree on certain points.
-
• #30
What are your plans with IE?
Ignore 6 and perhaps 7 (stats show they are largely irrelevant on here); focus on 8 and 9.
When it comes to canvas, if IE is still lagging then I can use a server-side conditional to include something like the Raphael library, so that JavaScript can do the canvas work in IE whilst canvas-supporting browsers will just do it natively.
Browser breakdown:
1) Firefox (36%)
2) Safari (25.57%)
3) IE (24%)
4) Chrome (10.84%)
5) Opera (1.45%)

Mobile browsers come after that, as well as the edge cases (me when I'm using lynx from a command line).
IE is 3rd place on LFGSS, and even then for IE, version 8 is 55% of IE users, version 7 is 31% and version 6 is 14% of IE users. Basically IE6 only has a 3% use on here, and on LFGSS Firefox and Safari dominate, but Chrome is the fastest growing browser. IE is in decline across the board and Opera never factored in.
-
• #31
Excanvas and conditional comments?
-
• #32
Excanvas and conditional comments?
That's pretty much what I said, now you're just arguing for the hell of it ;)
But I'd make the conditional server-side if all versions of IE didn't support it, and client side if only earlier versions didn't support canvas.
Basically I don't even want to send the conditional comments if they wouldn't be needed.
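For reference, the excanvas-plus-conditional-comments approach being discussed would look roughly like this (the script path is illustrative, and IE below 9 is assumed as the cut-off):

```html
<!--[if lt IE 9]>
<script src="excanvas.js"></script>
<![endif]-->
```

Only IE below version 9 downloads the emulation script; browsers with native canvas treat the whole block as a comment and never fetch it.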
-
• #33
Neat. A PHP based sniffer, or is there a neater way?
I remember the good old browscap.ini in asp classic ... YAY!
-
• #34
Neater way: an nginx conditional, which then adds a header to allow PHP to simply do a boolean check on the header.
I like headers :) People neglect to think about them.
nginx is the software load balancer, it receives the request and forwards it to the back-end appending some headers if necessary.
I use this trick already to enable both http and https to serve the same content whilst keeping internal links working. https triggers a header to be added that causes PHP to rewrite any http links on the site to https, and the opposite happens on http.
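A rough sketch of that header trick in nginx config (the header name, upstream name and directive placement are invented for illustration, not the actual LFGSS setup):

```nginx
server {
    listen 443 ssl;
    location / {
        # Tell the PHP back end this request arrived over https,
        # so it can rewrite internal http links accordingly.
        proxy_set_header X-Forwarded-Proto "https";
        proxy_pass http://backend;
    }
}
```

The application side then only needs a boolean check on the corresponding request header before deciding how to write out links.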
-
• #35
Nice ...
nginx ... I did play with that in order to separate Apache-served content, static files and long-polling scripts for Comet. Didn't get around to using it in production so far.
-
• #36
Yeah, nginx also does reverse proxy caching... all static files on static.lfgss.com come from nginx on the load balancer. If the file isn't there it pulls it from a web server and caches it on the load balancer.
Only dynamic traffic should get down to the web servers themselves, everything static should come from the load balancer after the first hit.
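As a sketch, the reverse-proxy caching described above is only a few lines of nginx config (paths, zone names and the upstream name are invented here):

```nginx
# Cache zone held on the load balancer; static files are served from
# here after the first hit instead of touching the web servers.
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=static:10m;

server {
    server_name static.lfgss.com;
    location / {
        proxy_cache static;
        proxy_cache_valid 200 10m;
        # Cache miss: fetch from a back-end web server, then cache it.
        proxy_pass http://webservers;
    }
}
```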
-
• #37
Aren't you using a CDN? Doesn't caching make server-side User Agent sniffing impractical?
-
• #38
Sorry, but XHTML 2 is a working draft. Like html 5. It departs from the current HTML standard and utilizes XFORMS, XLINK etc.
XHTML 2 was abandoned in favour of XHTML 5 (the XML serialisation of HTML 5). It therefore has no chance of ever progressing beyond working draft status, and so doesn't actually exist.
Just that often reality catches up, and somehow people inject invalid code.
I think to write a site in semantically correct and high standard HTML4 and at a later stage change the doctype to HTML5, revalidate, change minor issues is a realistic scenario for a website.
It also would be future proof. Why mention future proofing if XHTML 1.0 is where you leave it? That is just current-state proof.

Because internet forum software is a massive hacker target it can, should be, and in the case of LFGSS is, locked down to prevent code injection. It shouldn't be an issue. (Generally I let go when the client signs off. I deliver valid code and if they subsequently fuck it up that's their problem.)
I am not completely against using HTML 4, but my experience is that it encourages sloppy code, because validators let all sorts of shit through and inexperienced developers think that is all that matters. Neither am I against the idea of HTML 5, but I think it's too early to rely upon it. If a really good case can be made for choosing HTML 5 for this project then fine, but so far I am not convinced it has been.
I mean future proof because it's a mature standard, and if you stick to standards and specify the right doctype it will continue to be supported long after it is obsolete.
HTML 5 is not future proof because everything is still up for discussion. What is supported now may not continue to be supported by the time the standard is ratified, or may be supported in a way that breaks code written now.

Is it possible to generate the <!doctype html> thingy with MSXML and the xsl:output directive? With xsltproc it's just not possible.
It's been a while since I produced XSLT base templates, but from memory, yes, you can put whatever you like as your doctype with an output directive. Most sites I do these days have .Net base templates because of the CMS we use.
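For what it's worth, the HTML5 draft includes a system identifier precisely for this XSLT limitation, which xsl:output can emit and which still triggers standards mode (a sketch, assuming your processor supports the doctype-system attribute, which MSXML and xsltproc both do):

```xml
<!-- Emits: <!DOCTYPE html SYSTEM "about:legacy-compat">
     which the HTML5 draft treats as equivalent to <!doctype html> -->
<xsl:output method="html" doctype-system="about:legacy-compat"/>
```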
Note that the HTML5 doctype was designed around what browsers already do: it triggers standards mode regardless of whether HTML5 was known at the time the browser was conceived.
Whatever name they give it, it cannot be "standards mode" if no standard exists - and for HTML5 it doesn't. It can only be quirks mode until there is a standard to make it compliant with.
HTML 5 is one day going to be very useful, but it's probably a bit early to jump in now.
-
• #39
The tags make sense, and existing browsers won't treat things like footer as tag soup if you use CSS to declare footer as a block element with the desired spacing and font.
Apart from IE, which will ignore the markup entirely. You need to use JS to create all non-HTML 4 elements in the DOM, which will lard up your page weight no end.
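The JS workaround referred to above is roughly the createElement trick, something like this (a sketch; the element list is illustrative):

```html
<!--[if lt IE 9]>
<script>
// IE below 9 won't apply CSS to unknown elements unless they have
// been created once via document.createElement before use.
var tags = ['header', 'nav', 'section', 'article', 'aside', 'footer'];
for (var i = 0; i < tags.length; i++) {
  document.createElement(tags[i]);
}
</script>
<![endif]-->
```

Wrapping it in a conditional comment keeps the extra script away from every browser except IE.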
Using most of the commonly agreed upon (and unlikely to change) parts of HTML5 will result in semantic and cleaner code. That is the point... less code, clearer code, and ultimately from that code that is parsed and rendered faster.
The code will be cleaner, but it is unlikely to be faster because of the way CSS gets processed.
-
• #40
Whoa, I didn't see that they abandoned XHTML 2 ... thanks for pointing that out.
-
• #41
I think when people distinguish between Quirks and Standards mode, they're talking about different rendering behaviour:
http://www.cs.tut.fi/~jkorpela/quirks-mode.html
Maybe "Standards" mode is a poor choice of words, but the HTML5 doctype triggers this rendering behaviour in most browsers.
-
• #42
Whatever we go with, this discussion is very valuable. We need to have all of this in mind when turning designs into code. It's important too that the requirements for future functionality are firmed up.
If we're going to add things like spoke calcs, polo league charts, etc., this should all be specced out. We can't make informed design decisions until it is.

Incidentally though, when I turn designs into code I do it all by hand, so I'm going to be producing XHTML, because that's what I have been hand coding for years. Someone else can change it into HTML5 if that's the way we're going.
-
• #43
Aren't you using a CDN? Doesn't caching make server-side User Agent sniffing impractical?
I think you don't know enough about the capabilities of nginx to discuss this bit at the moment.
Even with static serving from a reverse proxy cache, it is still possible to perform server-side UA sniffing. In fact, you can do this with zero overhead... the load balancer runs totally in RAM and performing a little work there has practically no overhead whatsoever. All I can play with are the headers, the request and the response, but that's precisely where the UA is.
Because internet forum software is a massive hacker target it can, should be, and is in the case of LFGSS, locked down to prevent code injection.
I don't trust markup software and filters enough to leave it up to the PHP application.
LFGSS uses mod_security to strip out XSS attacks that vbulletin hasn't taken care of.
We even send HTTP 444 when foul input has been detected (a non-standard status that causes nginx to close the connection without replying).
I don't leave up to the application what the systems can secure.
Then again, I don't leave up to the system what the application can secure ;)
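As a sketch, the 444 behaviour can be expressed directly in nginx config (the detection pattern and upstream name here are invented; the post above says the actual detection is done by mod_security):

```nginx
location / {
    # 444 is nginx's special "close the connection without a
    # response" status; here triggered on an obviously foul query.
    if ($query_string ~* "(<|%3C)script") {
        return 444;
    }
    proxy_pass http://backend;
}
```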
-
• #44
I think you don't know enough about the capabilities of nginx to discuss this bit at the moment.
This is true. You would have to explain it in baby speak. I only do code, not servers.
Even with static serving from a reverse proxy cache, it is still possible to perform server-side UA sniffing. In fact, you can do this with zero overhead... the load balancer runs totally in RAM and performing a little work there has practically no overhead whatsoever. All I can play with are the headers, the request and the response, but that's precisely where the UA is.
The reason I ask is that we use Akamai, and if we cache at all we have to avoid all sorts of server side things - Akamai hits our servers and sometimes (not sure why not always or never) sends its own user agent, for example, not that of the requesting user. As far as I knew there was no way to get around this - but what you're saying contradicts what I was told. Our back-end is IIS and .Net - is it possible to do what you intend to do with that?
Akamai, at the moment, would also cache any user-agent-specific response, so person no. 1 on IE gets a new copy of the document with IE mods, but then for the next 10 mins persons 2, 3, 4, etc., all on different browsers, would also get that IE-specific document served from Akamai's cache rather than their own specific one. Until the next revisit interval, anyway.
-
• #45
This convo about doctypes and load balancing etc. has gone well over my head to a certain extent ... I either use Transitional or Strict, just because I can use the validator to check my markup is correct, which solves numerous cross-browser rendering problems.
Mostly I am really looking forward to working on something I really wanna work on, and my CSS skills are getting pretty good now ... so I am looking forward to having a nice design and getting to work on something I think I can do a really good job on. :D ... I am damn good at getting IE6 and 7 to behave without having to resort to alternative stylesheets or hacks (except with PNG support) ... always ... argh, I need to trigger "hasLayout".
I think it will make a pleasant change from work, where I have the manager breathing down my neck (who is a travel agent, not a developer), asking me why "I am coding", and my specification is a PNG if I am lucky. So I am raring to go on something I really want to get my teeth into.
-
• #46
A fair amount has gone over my head also, but it sounds like you know what you want, and you know your reasons for wanting it, which is always good!
-
• #47
Do the entire site in PDF form, and use a realtime PDF templating tool to generate it. That'll work, and you can shove your standards up your arse...
-
• #48
PDF? Really... LaTeX surely ;)
-
• #49
Make the entire site one big Java applet.
-
• #50
Entirely txt file based?
There are standards modes and standards modes. Some browsers treat HTML 4 standards mode differently to XHTML 1.0 standards mode. XHTML 1.0 transitional or strict produces the most cross-browser consistent results. HTML 4 is badly defined, and thus many things are open to interpretation, even in strict mode. You can write exactly the same code, all error-free, in XHTML 1.0 and HTML 4, serve it up with the correct doctype, and get different results.
XHTML 2 doesn't exist.
Which is why I specifically recommend 1.0 transitional
You don't "move to HTML 5". Your doctype ensures that it remains XHTML 1.0 transitional.
Speaking about how I run projects commercially: people on my teams who forget about standards get forcefully told to bloody well remember them! It's also a matter of professional pride. If a developer doesn't care about producing standards-compliant code I don't want them on my project.

User-generated content can be a problem. Everything I produce uses an XSLT-driven content management system, which ensures XML grammar at least. I also run content through a C# function to remove obsolete tags and replace some characters with entities. It mitigates some of the issues, and I configure the CMS fairly rigorously too. However, with this site that's not going to be an issue since there is no rich text content creation to fuck things up. It's important to be XML compliant when you are also creating RSS feeds, which LFGSS does.
That is not an issue in the case of LFGSS.
You say forgiving. I say badly defined. And bringing up the application/xhtml+xml mime type is irrelevant, as we won't be using it. text/html is the only thing that works, so it is the only thing that gets used. Even with text/html, XHTML 1.0 triggers tighter standards-compliant rendering than HTML 4 in some browsers. HTML 5 is not a standard at all, so of course it's 'forgiving'. In IE it is completely ignored, unless you pollute your code with JavaScript.
We do have very harsh sanitisation and validation on LFGSS. It's also a text-based internet forum. We do not need canvas or most of the other new tags.
I don't understand how one reaches the conclusion that because browsers are not strict enough in their standards implementation the solution is to take a gamble on something which isn't even a standard, in the blind hope that it will produce more consistent results. It's not. The solution is to be rigorous in your code and merciless in your adherence to the best-performing standard that we do have. That is XHTML 1.0 transitional, served as text/html.