General discussion


Standards Compliance In Website Design.

By Jaqui

Why have we let the policies of one company dictate that we abandon professional responsibility and stop insisting on standards compliance?
What would you do if an electrician didn't insist on making your home or office standards compliant? Would you accept that standards are not to be met? Or would you insist on the standards being met?

I assume that we all would insist on the standards being met.

So why have we let one company's policy of ignoring standards dictate that we cannot be responsible professionals and create standards-compliant websites?
I would suggest that the next time someone asks us to design a non-standards-compliant website, we insist that they sign a legally binding document accepting full responsibility for any damages incurred by visitors to their site through data loss, viruses, phishing, or any other means.
If they won't sign this document, then they will get only a standards-compliant website.
We are the professionals, so it falls to us to take responsibility for meeting the standards of our profession.

It's all well and good to say that we have to make sure everyone can visit the website. But that is pure FUD.*
It's the responsibility of the browser developers to make the browsers standards compliant. If we don't make compliant sites the norm, then they have no incentive to make the browsers compliant.

So, that leads to the question:

What is the standard for website design?

According to the W3C, it's XML and XHTML.
Since the World Wide Web Consortium is the recognized body in charge of setting web standards, theirs is the standard we should be meeting.
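As a rough illustration of what that standard means in practice (a sketch, not the W3C's full validation process): a compliant XHTML page is, at minimum, well-formed XML, so any stock XML parser can read it. For example, with Python's standard library:

```python
# Illustrative only: a tiny XHTML page checked for well-formedness with
# Python's standard-library XML parser. ParseError is raised if any tag
# is unclosed, misnested, or otherwise malformed.
import xml.etree.ElementTree as ET

page = """<html xmlns="http://www.w3.org/1999/xhtml">
  <head><title>Compliant page</title></head>
  <body><p>Every tag is closed, lowercase, and properly nested.</p></body>
</html>"""

root = ET.fromstring(page)   # raises ET.ParseError if not well-formed
print(root.tag)              # namespace-qualified root element name
```

The real W3C validator goes further and checks the markup against the declared DOCTYPE; well-formedness is just the entry ticket.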

* FUD used instead of "unadulterated bullsh|t".


Rules of the Road for the Internet?

by stress junkie In reply to Standards Compliance In W ...

Your post touches on a larger issue which I think we as a society need to consider: rules and requirements for accessing the Internet. When you write
"...we insist that they sign a legally binding document..."
you are moving toward exactly that.

I greatly favor seeing some rules for accessing the Internet. I favor requiring people to have a license, much as a driver's license is required before people can drive a car on the public roads.

In the case of web site standards there would also be a license. If a person were running a nonstandard web site then they would lose their license and their ISP would be required to shut them off. In the case of end user licenses the ISP would monitor their network for suspicious activity. When zombie machines or other malicious clients were identified they would be denied access to the network until the problem was corrected.

We impose requirements on people when they want to drive a car on the public roads. We should also impose requirements on people when they want to use the Internet.

We also impose requirements on the mechanical condition of cars that are allowed to drive on the public roads. We insist that they are regularly inspected to ensure that the brakes work and that the suspension is intact. We should also impose requirements on the integrity of the computer environment on clients that want to access the Internet. We should insist that machines that have viruses are denied access to the Internet. This can be done by watching the network traffic.

This could be implemented very easily. My ISP, Comcast, has a system where you have to register the MAC address of your cable modem. If you change cable modems, as I did a few weeks ago, then when you access the network via the web you are directed to a web page that says that you have to call their help desk.

This same kind of system could be used to deny access to clients that have been identified as being malicious. End users would be directed to call their ISP's customer service. When they do that they will be told what problem had been identified with their client and they would be told that the problem would have to be fixed before they could use the network.
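The gate being described could be sketched roughly as follows; every MAC address, message, and function name here is invented purely for illustration:

```python
# Hypothetical sketch of the ISP-side gate described above: unknown or
# flagged MAC addresses get a "call us" page instead of network access.
# The addresses and messages are invented for illustration.
registered = {"00:1a:2b:3c:4d:5e"}            # known customer modems
quarantined = {"aa:bb:cc:dd:ee:ff"}           # flagged as zombie/malicious

def gate(mac: str) -> str:
    """Decide what a client with this MAC address sees."""
    if mac in quarantined:
        return "redirect: contact customer service to remediate"
    if mac not in registered:
        return "redirect: call the help desk to register this modem"
    return "allow"

print(gate("00:1a:2b:3c:4d:5e"))   # allow
print(gate("aa:bb:cc:dd:ee:ff"))   # redirect to customer service
```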

The only problem with this solution is that it has to be implemented regionally. In other words, every national government would have to pass laws to implement this solution. Naturally there will always be governments that will not go along. These would probably be the same governments that host the majority of the malicious clients now. Russia and China immediately come to mind.

Nevertheless I think that it makes sense to impose some "Rules of the Road" to Internet usage. We do this with other public resources for good reason. We should do this with the Internet as well.


so start

by Jaqui In reply to Rules of the Road for the ...

a lobby group to get laws passed?
you do realise that the USA is the single most infected area with zombie machines, courtesy of AOL, don't you?

actually, go over your agreement with your isp. it already states that if you are running a server and it's infected, or you have an infected home system, that they will cut you off until you clean it up. they just don't bother enforcing it.


USA!!! ... USA!!! ... USA!!!

by stress junkie In reply to so start

" do realise that the usa is the single most invected area with zombie machines, courtesy of aol, don't you?"

Nobody does it better!!!



go ahead.. destroy the internet

by netgenner In reply to Rules of the Road for the ...

The Internet is a fluid technological environment where anyone with the skills and time can come up with a better way of doing things. It is also the closest the world has ever come to having a global exchange of virtually infinite free information and communication. If you are going to impose fascist right-wing looney controls like requiring a license to connect to the internet or put up a webserver, then this will immediately destroy the biggest single intellectual advance in modern human history. You may as well then just put proprietary banking and shopping terminals into homes, as that is all it will be good for. If some little software development company working out of a garage comes up with some new way of serving web pages that is 100x more efficient, would you throw it out just because it doesn't strictly meet W3C standards (or they don't have a "licence")? Because that is what the nightmare environment of fascist control will achieve: it will totally stifle innovation.

That some people have viruses, some browsers have varying standards, and some users are malicious is simply a reflection of the complexities of the real world - part of what makes the Internet so valuable, and a challenge to developers and vendors to continue thrashing out the technology in a cycle of continuous improvement. IE has something like 99% of the browser market - it may flout the standards, but if that's so bad, then why is it so successful? And if almost everyone uses it then, taken objectively, it becomes the standard, rightly or wrongly.

Surfing the internet is not like driving a car, which is a physical 2-ton piece of metal that you are hurtling down a road at 65 mph. Such analogies are very dangerous - different risks, responsibilities and benefits apply to different activities. You might as well insist that people should have a licence to use an oven, because there are many bad cooks and cooking can also be dangerous. Where does it end? Should you have a licence to use the telephone? Maybe a licence to watch television? A licence to travel freely? A licence to have children???

It is not surprising that suggestions like this surface from time to time - the internet is primarily a free exchange of ideas, and that has always threatened authoritarian types as there is no way to propagandise, commercialise or impose morals on it.


but, while the concept

by Jaqui In reply to go ahead.. destroy the in ...

of free access to information is what started the internet, using proprietary formats goes right against that.
if every site was coded to standards, that would guarantee access for all, not just those with the "right" tech on their system.

non-standards-compliant sites hurt the core of free access to information far more than legislation requiring that standards be met in site design would.
those standards are designed to guarantee access.


Free Market Economics

by Raven2 In reply to but, while the concept

Hey, the answer is: let Free Markets run the web. It is just so efficient and does wonders for diversity and getting the real story out to the world. We can have all those soooo helpful major media corporations set the standards so that we all profit.

The previous fairy tale was brought to you courtesy of the same folks who brought you the new hit "Dubya II", playing in governments everywhere.


Amen, netgenner

by Mikiel In reply to go ahead.. destroy the in ...

Amen, brother! It was a lack of imposed rules that allowed the Internet to grow so rapidly and well. It was a lack of imposed rules that allowed much of the technological innovation.

I guess some people found it scary and just mean-spirited to people with older browsers when evil developers perverted the pristine HTML world with "tables" and "graphics". How dare they mix layout with semantics! Not without long debate by an international committee, No-sir-ee! Left unchecked these deviants would be creating interactive maps and RSS feeds. You need to be licensed or locked up, I say! Luckily the fact that less than 100% of browsers could support non-text features made those misguided ideas go nowhere, and is why everything on the Internet today is displayed as text-only - and it's a lot better for it, I might add!

(Though I must say that requiring a license to have children has a lot of merit...)


Can't guess where this thread got spawned from

by Tony Hopkinson In reply to Standards Compliance In W ...

Stuck between "I didn't make the world, I just live in it" and WTF on this issue.
The most annoying thing about HTML and XHTML is that the latter is much easier to parse and therefore process, but of course non-standard sites had proliferated massively before the XHTML standard was proposed.
Just try, as a tech head, to get the resources to convert all your sites - fat chance, basically.
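The parseability point can be sketched concretely, with Python's standard-library XML parser standing in for any generic XML tool (chosen here purely for illustration): well-formed XHTML parses directly, while typical pre-XHTML tag soup is rejected and needs a forgiving, browser-style parser instead.

```python
# Minimal sketch: well-formed XHTML is consumable by any generic XML parser,
# while tag soup (unclosed tags, mixed case) raises a parse error.
import xml.etree.ElementTree as ET

xhtml = "<div><p>hello</p><br /></div>"   # closed, nested, lowercase
soup = "<DIV><p>hello<BR></DIV>"          # unclosed tags, mixed case

print(ET.fromstring(xhtml).tag)           # parses cleanly

try:
    ET.fromstring(soup)
except ET.ParseError as err:
    print("tag soup rejected:", err)     # mismatched/unclosed tags
```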

The only potential driver would be a very popular tool for dealing with sites that is only XHTML compliant, but if it didn't support a very sizable portion of its potential market, how would it become popular?
We'd need a time machine to sort this one out: go back and give ourselves another chance to catch the boat.

well,

by Jaqui In reply to Can't guess where this th ...

it's not actually anything new, in my opinion. nor are the attitudes.

actually, if the eula with service providers was being enforced by those service providers, you would quickly get the resources to convert the sites, since the non-compliant sites would be taken offline. most non-standards-compliant sites are subject to viral infection, making them a violation of the isp's eula and giving the isp grounds to remove the site. (this specifically means the activex controls being used, as well as the frontpage extensions)



by Tony Hopkinson In reply to well,

Now if we could get rid of client-side code execution (sigh), but that is within the standards, unfortunately. As for enforcing EULAs, best of luck on that one; you've more chance of getting a windmill to cooperate in a jousting event. They are there to protect the people who write them, not those who click on the agree button. They are at best a bad joke, only slightly more funny than an MS standard.

Have to stick with Sun Tzu on this.
