While it will be good to have HTML 5 and all the additions that come with it, something about it made me pause. It will have all these input types so you don't have to do client-side validation with JS hacks etc. The UI won't even let you enter bad values in your browser, JS or no. Does this mean that there'll be a lot more possible server hacks because people aren't validating their input anymore? Will this matter for the most part, or will the app just kack and throw a 500 error? Do we care to have a nicer error message than that?
It'll be awesome when you can put an annotation (in Java) on an attribute and have it validated in the browser, in the server-side MVC layer, and in the DB layer before the persist (via JPA validation)... Hell, in 5 years Java might have all that magic that Rails had 3 years ago... :-P
via /.
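For the curious, here's a rough sketch of that annotation idea in the style of the draft Bean Validation spec (JSR 303). The entity and constraints are invented for illustration, and you'd still need a provider wired into your JPA and MVC layers to enforce them everywhere:

```java
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.validation.constraints.NotNull;
import javax.validation.constraints.Size;

// Hypothetical entity: the constraints are declared once, and a
// Bean Validation provider can check them before JPA persists the object.
@Entity
public class Account {
    @Id
    private Long id;

    @NotNull
    @Size(min = 2, max = 50)  // rejected before it ever reaches the DB
    private String username;
}
```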
Developers need to remember that an HTTP GET or POST can be done without a browser. Anything that can arrive over HTTP should be checked on the server side to handle misbehaving user agents (XSS exploits exactly this gap). I don't know about the Java libraries, but Rails makes this kind of testing easy.
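To make that concrete, here's a minimal Java sketch of how little it takes to POST to a form handler with no browser in the loop (the URL and parameter name are hypothetical):

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;

// Sends an arbitrary, unvalidated payload straight to a form handler.
// No HTML 5 input types and no JavaScript, just raw HTTP.
public class RawPost {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://example.com/comments"); // hypothetical endpoint
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type",
                "application/x-www-form-urlencoded");
        String body = "comment="
                + URLEncoder.encode("<script>alert('xss')</script>", "UTF-8");
        OutputStream out = conn.getOutputStream();
        out.write(body.getBytes("UTF-8"));
        out.close();
        // Whatever the page's widgets "allow" is irrelevant here; only
        // server-side validation stands between this payload and the DB.
        System.out.println(conn.getResponseCode());
    }
}
```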
Client-side input checking is just a nice bonus for a subset of user agents: the newest browsers. Client-side work can save validation round trips to the server, but it obviously shouldn't be used instead of server validation.
Let's all remember, it's going to be a long time before a large percentage of browsers in use support HTML 5. Until then, if you want all your users to have the same experience, you're going to have to write the JavaScript validation yourself.
And yes, you still have to do validation at the server. A lot of programmers don't realize you can do an HTTP POST/GET without a browser; they're completely oblivious to how it works.
Personally, I just like to do server-side validation. It makes the code a little simpler, and it also keeps your validation consistent across all users. It may take the user a little longer to get things done, because they may have to submit the form a couple of times before they get everything right, but as long as you return the form to them the way they submitted it, with everything filled in, it's not really much of a problem.
Yep. I like server-side validation because it can be unit tested more easily and more thoroughly than client-side JavaScript. But client-side checks are a nice UI improvement and can make a modern website "seem" faster. If the client-side validation lets anything through (accidentally or intentionally), no harm done, because the server-side validation will catch it.
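As a rough illustration of why that's easy to unit test: a server-side check can live in a plain object with no servlet or browser in sight (the class and rule below are invented for the example):

```java
// A plain validator: trivial to exercise from a JUnit test, no browser needed.
public class CommentValidator {
    public boolean isValid(String comment) {
        return comment != null
                && comment.trim().length() > 0
                && comment.length() <= 500; // arbitrary limit for the sketch
    }
}
```

A test then just calls new CommentValidator().isValid(...) with good and bad inputs and asserts on the result.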
ReplyDelete"A lot of programmers don't realize you can do HTTP POST/GET without a browser." - those people are writing web apps? Ouch.
One thing that I liked in my brief experience with Perl CGI was the concept of "tainted" variables (Perl's taint mode). You couldn't do certain operations on parameters until you had sanitized them. Interesting idea.
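You could fake that discipline in Java with a wrapper type, so raw request parameters can't be used by accident. Everything below is an invented sketch, not a real library API:

```java
// Invented sketch of a taint-mode-style wrapper: the raw value is
// locked away until someone explicitly sanitizes it.
public final class Tainted {
    private final String raw;

    public Tainted(String raw) {
        this.raw = raw;
    }

    // The only way out: strip everything outside a conservative whitelist.
    public String sanitized() {
        return raw == null ? "" : raw.replaceAll("[^A-Za-z0-9 .,@_-]", "");
    }
}
```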
I've forgotten, but does a browser send any kind of info about what HTML etc. it will support? For example, could you easily write an app that would send back HTML 5 iff the UA accepts it, or "fall back" to HTML 4? I'm thinking of the new form widgets, since I'm sure Firefox 2 (for example) will ignore an HTML input of type "month". That would really suck to test, but if taglibs took care of it, you wouldn't (really) need to worry about it.
Well, most browsers only send the browser name and version, along with the OS, in the User-Agent header; nothing declares which HTML version they support. It would be pretty easy to check the browser and version and have the program figure out whether to use HTML 5 or not. There are only 4 or 5 different browsers anyway.
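So a servlet could pick its markup off the User-Agent header. A rough sketch (the header token and template names are illustrative; real sniffing needs a proper capability table, not substring checks):

```java
import javax.servlet.http.HttpServletRequest;

// Illustrative only: choose which form markup to render per browser.
public class TemplateChooser {
    public String chooseForm(HttpServletRequest request) {
        String ua = request.getHeader("User-Agent");
        if (ua != null && ua.contains("SomeHtml5Browser")) { // hypothetical token
            return "form-html5.jsp"; // template with the new input types
        }
        return "form-html4.jsp";     // safe fallback for everyone else
    }
}
```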
Making different versions of a website for specific browsers is just asking for trouble. A better idea is just to wait until the majority of people have a browser that supports HTML 5.
That day is closer than we might think: Firefox updates itself, and Windows is getting a lot more aggressive about forcing IE updates. Apple forces people to update Safari too, of course. It'll be nice when people always have the most up-to-date browser by default.
What about the people still using old browsers somehow? Once the vast majority of people have updated browsers, force the rest to upgrade by detecting old browsers and redirecting them to a page explaining how to upgrade. There's far too much legacy support on the web, given that browsers are FREE.
Old browsers aren't worth the cost of supporting them if they are any less than 5% of your potential traffic. This issue is controversial, so I'll don my flameproof suit! ...but I've drawn a line, darnit!
"those people are writing web apps? Ouch"
Yep. Look no further than the hubbub about the Google toolbar's "browse ahead" feature. It fetched all of the links on the current page automatically, so when you clicked on any of them, the page switch was instantaneous because the page was already in the browser cache.
The problem? Lots of developers had delete links on their sites, which are regular HTTP GETs! So browse-ahead would see a delete link, think it was a page to fetch, follow it, and delete the content!
The browse ahead feature was just following the HTTP spec: GETs aren't supposed to change the state of the website, only passively read. POST, PUT and DELETE are the mutating HTTP actions and browse ahead avoided those.
Of course most of us don't read the HTTP spec that closely because HTTP seems like a lower layer we shouldn't care about. I think back on my early web dev days and I was guilty of it too -- but not now! In my Rails apps I only allow a delete to work on a POST (which is common, though I could also restrict to just DELETE).
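The plain-servlet version of that guard might look like this sketch (the servlet and parameter names are made up): mutation only happens on POST, so a crawler or prefetcher following a link can't destroy anything.

```java
import java.io.IOException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Sketch: GET stays read-only, so link-following robots are harmless here.
public class DeleteItemServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        // A GET must never mutate state; refuse it outright.
        resp.sendError(HttpServletResponse.SC_METHOD_NOT_ALLOWED);
    }

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        String id = req.getParameter("id"); // hypothetical parameter
        // ... delete the item with this id, then redirect to the list ...
        resp.sendRedirect("items");
    }
}
```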
Now you've been warned! ...even though browse ahead ultimately failed because too many people were using GETs to mutate their websites and browse ahead was ruining tons of data ...d'oh.
It's not just browse ahead. Search engines will also innocently traverse links as GETs, though most websites have delete links behind a password wall that search engines can't get past. That was the only thing saving them ... and of course browse ahead could go where search engines couldn't.
I think that I've been guilty of the "delete links" myself actually... but most delete operations that I do have a "do you really want to..." step.
While working in frameworks like Struts, once the data got to the action class I never really cared whether it came from a POST or a GET. It makes a lot of sense to restrict it based on the method...
One of the designers I've worked with prefers some operations, like edit/delete, to be done with images/links. Now that I'm thinking about it, that could be changed to a POST and a couple of hidden fields... Hmm... I think I'll bring that up at work.
Too bad you can't also specify the method on an anchor the way you can on a form...
I also agree that making one version of the website for all browsers is the only way to go. It makes maintaining things a lot easier. However, if you have some kind of framework that generates your HTML for you, then it wouldn't be a completely bad idea to have the framework handle spitting out the newest HTML that the browser being used supports.
However, about everybody having the newest browser: some people are still stuck on Windows 98. They can't use IE 7, and from what I hear, Firefox 3 won't support Windows 98 either. Here's the thing: no matter what you do, some people just aren't going to be able to view your site. You have to weigh the pros and cons and decide for yourself whether using a specific feature will do you more good than harm. There are a lot of users out there who disable JavaScript, and if your site depends on JavaScript, then it won't work for them. Depending on what you want your site to be, you may decide it's OK that they can't view your site and just focus on the users who do have JavaScript turned on. The same goes for Flash, Silverlight, RealPlayer, HTML, CSS, and any other technology you want to use.
'...but most delete operations that I do have a "do you really want to..." step'
Search engines and browse-ahead skip* this client-side JavaScript confirmation and just delete the content if a regular link (GET) is used.
*skip: because they don't "click" on links. They read all of the link URIs on a page and then open them.
Ah, let me clarify that. My use of JS is *very* minimal, since it's a GoC requirement that the site work without JS. So that "are you sure you want to delete this" step is a separate form page that does a POST.
I brought this up with our usability guy and he seems to remember that there was a problem with it. I've created a mock-up and will discuss it with him again today.
Oh, you meant a separate page and not a JavaScript alert. Yes, that works well, as long as the confirmation page does a POST to delete the item from the database.