
Why HTML5 makes justifying native applications more difficult

Justin James, who is now very bullish on Web applications, explains what you need to consider when deciding whether to write a desktop application or a Web application.

One of the biggest shifts in development in the last few years has been the move to Web applications. For a long time, developers resisted this move, and some of the reasons were good. For example, for years I argued that the Web model wasn't so great: the UI capabilities weren't there without a ton of work, and the ability to do "real work" was lacking. Some of the reasons were not so good, and mostly boiled down to a refusal to learn something new.

I have recently become very bullish on Web applications, and I now highly recommend that you consider them over desktop applications in all but a very few sets of circumstances. When you are deciding between writing a desktop application or a Web application, these are things that you should take into consideration.

When do you "have to" write a Web application?

Under certain conditions, you must write a Web application, regardless of what you would prefer to do, either for business or technical reasons. Here are some of those situations:

  • Zero footprint installation (Flash and Silverlight work well here too)
  • Software-as-a-Service (SaaS) billing model
  • Cross-platform compatibility (Yes, you can use one of the cross-platform widget libraries with native code too, but Web is much easier.)
  • Server-centralized control of logic, resources (data, CPU, et al), etc.
  • Public-facing application whose highly sensitive data needs to stay behind a firewall
  • Needs to communicate between the client and a centralized server outside of the client's network

Note that some of these situations have exceptions. That's fine, but keep in mind that in these scenarios the alternatives are often no better than Web applications, and in many ways worse.

Mobile

In large part, the growth in mobile devices was the beginning of the end for desktop applications. For years, you could count on targeting Windows and hitting 95% of the potential users. Now, so many people (especially consumers) do so much of their computing on mobile devices that it is hard to get complete coverage of the market with a native-only approach. To make matters worse, not only is the mobile market highly fragmented between the different mobile operating systems, but the #1 player in mobile right now (Android) is highly fragmented within itself. If you need full coverage of the market without learning enough native development to cover the mobile platforms well, Web is the way to go.

Windows 8

Windows 8 is the straw that broke the camel's back for me. Pre-Windows 8, I would say that you shouldn't write a Web application unless you had to. Despite the boom in mobile, it alone is not enough to tip the scales for many apps; in my opinion, Windows 8 is.

With Windows 8 looming, the time is right to reconsider native applications entirely. Instead of asking, "Do I need to write a Web application?" you should be asking, "Do I need to write a native application?" Why? Simply put, with the changes coming in Windows 8, if you want to write applications that require a lot of input and the other hallmarks of "real work," the Metro UI is not great for it, and if you don't want to use Metro, you are locked out of the ARM devices. To put it another way: unless you are writing an application that works well as a touch application, a Windows 8 native application is a bad idea, and a "legacy" Windows application locks you in to the existing Windows market, which will only get smaller over time.

Do I need to write a native application?

That's a good question! Let's look at the common wisdom answer to this question. Typically, I would say "yes" if the following are true:

  • Needs to be able to work offline as well as online
  • Requires substantial access to the local system's resources, particularly CPU/RAM
  • Performance considerations
  • Heavy graphics work
  • Integration with other systems
  • Desktop-quality UI widgets

Up until HTML5, I would call that a really good list. HTML5, though, answers each of these in a way that makes it even harder to justify native applications. Let's look at that list again:

  • Needs to be able to work offline as well as online: HTML5's local storage capabilities provide enough offline data to support syncing.
  • Requires substantial access to the local system's resources, particularly CPU/RAM: Web Workers give you a development model for doing heavy work without hogging the system or blocking the browser (see the sketch after this list).
  • Performance considerations: Browsers' JavaScript engines have made huge improvements, especially Chrome's and Internet Explorer's.
  • Heavy graphics work: The <canvas> tag allows bitmap graphics manipulation.
  • Integration with other systems: WebSocket allows full-duplex, persistent connections.
  • Desktop-quality UI widgets: Improvements to the <input> element and the addition of the <menu> element allow for a much richer UI experience; jQuery has continued to fill in the gaps.
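
To make the Web Workers point concrete, here is a minimal sketch of pushing heavy computation off the UI thread; the file name "crunch-worker.js", the element ID, and the message format are invented for the example:

    // main.js -- keep the page responsive by handing heavy work to a worker.
    var worker = new Worker("crunch-worker.js");   // hypothetical worker file

    worker.onmessage = function (event) {
        // The worker posts its result back when it finishes.
        document.getElementById("result").textContent = event.data;
    };

    worker.postMessage({ iterations: 50000000 });

    // crunch-worker.js -- runs on its own thread, so the page never blocks.
    onmessage = function (event) {
        var total = 0;
        for (var i = 0; i < event.data.iterations; i++) {
            total += Math.sqrt(i);
        }
        postMessage(total);
    };

The offline point works the same way in miniature: a call like localStorage.setItem("draft", JSON.stringify(data)) keeps data on the client until a connection is available to sync it.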

As you can see, HTML5 offers compelling reasons to re-evaluate the decision to go Web or stay native. Indeed, other than certain specialized or niche scenarios (like integrating directly with a "legacy" Windows application), it is very difficult for me to recommend building native applications for typical consumer or line-of-business applications going forward.

J.Ja

About

Justin James is the Lead Architect for Conigent.

154 comments
terjeb

HTML5 is (going to be) cool, and I am going to create quite a few of those, particularly in the typical in-house app scenario where you present data and give the user some limited ability to interact with and update it. HTML is already quite good for that, and 5 will improve it. For big complex apps, no way. There are a few reasons. Reason #1 is basically that there is no good MVVM framework for HTML/JS. Whatever you think of Microsoft, the way they have done XAML/.Net is actually a very good way to develop apps. You design the user interface in a design language (XAML or HTML) and you have a View Model with all the (changing) data. The View Model is responsible for updating the View. The View Model is also responsible for updating the Model. This gives you really "dumb" interfaces, with zero intelligence. All View intelligence is in the View Model. Currently there are no such capabilities in HTML/JS. Hopefully that will change. The second thing is JS itself. It is an abomination, and efforts to improve it have stalled (partly thanks to Google). There are ways around this; CoffeeScript and GWT are good examples. In that respect, Scott Hanselman is correct: JavaScript is (or should be treated as) the Assembly of web-based applications, but the actual code is (or should be) written in a more suitable language.
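
For illustration, a hand-rolled version of the View/View Model split described above might look something like the sketch below; the element IDs and property names are invented for the example:

    // A bare-bones "view model": it owns the changing data and pushes updates
    // into the view. The view (plain HTML) stays dumb and only renders.
    function CustomerViewModel(view) {
        this.view = view;   // e.g. { nameField: an input, greeting: a span } -- placeholders
        this.name = "";
    }

    CustomerViewModel.prototype.setName = function (name) {
        this.name = name;                                   // update the data
        this.view.greeting.textContent = "Hello, " + name;  // update the view
    };

    // Wiring: the view forwards raw input events to the view model.
    var vm = new CustomerViewModel({
        nameField: document.getElementById("name"),       // hypothetical element IDs
        greeting: document.getElementById("greeting")
    });

    vm.view.nameField.addEventListener("change", function (e) {
        vm.setName(e.target.value);
    });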

xross

I'm a lifelong web guy and I love the open browser, the DOM, Javascript as a language, etc. But my experience building a real HTML5 app in place of a desktop one has been really disappointing. Examples:
- Graphical work in Canvas or SVG gets no hardware acceleration on mobile, which means it's horrible.
- Local storage and DB standards are an inconsistent mess.
- WebGL is nice but I had issues displaying basic text.
As someone developing a product, I unfortunately had to give up HTML5 for Flash. Now that the whole world is fleeing Flash (right or wrong), I'm baffled by my next steps. I'm thinking of developing a web version and desktop versions in an embedded Webkit, though I'm worried that may be a headache. My app is presentation based, which means:
- it should run locally *flawlessly* without fear of internet loss killing a presentation,
- it should run on Mac, Windows, web, and mobile (in a stripped form),
- it is *highly* graphical, even using 3D.
Flash/Flex/AIR can do all these. HTML5 sucks in actual execution here, despite the occasional neat looking demo. And many of the problems are due to browser inconsistencies that the main browser companies don't seem to be moving forward on (I'm looking at you Microsoft, Mozilla, *and* Google). Frankly, I'm frustrated by a tech world that keeps proclaiming "HTML 5 rocks, Flash and native apps are dead" while not actually fixing HTML5 to build real apps on it. Meanwhile, I'm the technical laggard for needing to build my platform on something that actually works. I must go now, I actually have a company to build and users to satisfy... Alex
p.s. So I go to click "Submit Reply" and, having expanded the size of the textbox, that button is now missing under the comment below. This quirk times 100 is what my personal experience building a real HTML5 app was like. And, yes, I'm angry... :)

yuri.paez

Please tell me how? Were previous versions of HTML really a standard? How many browsers do we have today? Does the same browser work and support the same level of HTML across all devices and all OSs? I think that today we have fewer native platforms that matter (Windows, iOS, and Android) than browsers. For simple applications I think it is a good solution. I agree that important data has to be ubiquitous and completely accessible any time, anywhere by the users, but you can do this with web services consumed by native applications. So why not get the most out of what your device can provide you: GPU, CPU, storage?

figgles

This is sort of like the "HTML 5 GAMES ARE THE FUTURE" argument. OK, let's see. I write video games in C/C++. Let's see why HTML 5 is NOT appropriate for my latest game and why it will be a native app: Needs to be able to work offline as well as online: When I have 20GB of game content (e.g. Dragon Age), let me know how streaming this works. Even 1GB would be too much. How's that 2GB/month data plan working out for you? Requires substantial access to the local system's resources, particularly CPU/RAM: Web Workers is a trashy solution to what is basically "threads". If you consider it a serious contender for "multicore" performance, you probably haven't really worked on performant code. Let's say you were too brick-headed to do it in C/C++ so your program is single-threaded. Unspeakable, I know. So you redo it in JavaScript using Web Workers. You utilize all 4 CPUs, but then you realize the code runs at 1/10th the speed of normal. So in effect, you've earned...? Oh that's right. Nothing. Performance considerations: See above. If you want performance, you don't use a web browser and a dynamic language. You use a statically typed language whose execution model mimics the underlying machine. What do you do when your code using 20 Xeon cores takes minutes to complete? Port it to HTML 5? Oh please. What if you need your GPU to execute code (PhysX from Nvidia, anyone?) to progress the game? You cry in a corner because there is no PhysX or anything else for HTML 5. Heavy graphics work: HAHA. Bitmap graphics manipulation? Dude, that's so 1990. You could at least have mentioned WebGL or something, but even then, WebGL is mostly being used for "faster 2d!". As it turns out, you can do some cool stuff with WebGL, but you can't do nearly as much cool stuff as you can with, say, DirectX 11 or OpenGL 4.x. The reason is obvious if you subscribe to the WebGL mailing list: they want to broaden the scope of hardware support to include mobile hardware. They leave out basic stuff like "texture compression". In effect, you've got an API targeting hardware from 2002. Nice. The language itself (JavaScript) causes a number of hilarious hacks in the API in the name of performance. Why? Because graphics cards require stuff to be laid out in certain formats, which requires explicit control at the byte level. JavaScript is entirely inadequate for this. Integration with other systems: WebSockets is a joke. It is a TCP-based system (which works for less demanding games) which allows connectivity over port 80. It multiplexes traffic over your connection. It allows for 'basic things', but I wouldn't expect serious traffic like an MMO or FPS to work. I guess you could make the most killer chat room ever though! Desktop-quality UI widgets: As it turns out, most game developers DON'T want this. The reason is that mixing native Win32/X11 stuff with hardware accelerated rendering causes serious performance problems. Guess what this is doing?

Parrotlover77

Replace HTML5 with "Java" and this article could have been published in the 1990s. Remember the Java Desktop that was going to kill Windows? Me neither. Once all the hype dies down, all that will be left are just the modern version of "Web 2.0" apps (from a few years back) for quick work and software-as-a-service / web services consumed for local applications. I think there's a bright future for entertainment apps in the cloud / on the web. Time has proven it's a business model that works. But the biggest players in the world (Google/Microsoft/etc) have had a decade now to bring us a decent basic "work to be done" web app, like a Word Processor, and it just absolutely pales in comparison to its local brethren. Local apps aren't going anywhere any time soon. And I haven't even brought up the privacy concerns that others have.

gskern

I am likely to hear groans when I say this, but I took the time to read this whole Thread and I just have to make this point: Citrix has been combining the best of most of these (all very good) points for years now: The security of keeping Apps and Data inside the organization, the very-small-Packet-size data stream to the User - who can hit this "cloud" from any browser - and the ability to work Offline in case the connection is dropped, complete with Sync once it comes back up again. Newer technologies like App Virtualization and User Virtualization have only added to the Citrix just-about-the-best-of-all-options approach... I hope the whole *concept* of anything like a "desktop" (i.e., "My Computer", "Recycle Bin", wallpaper, yada yada yada) is dead or dying; business Users need Apps, Data, and their Settings, and that's about it.

bill.hannah

I think while your arguments have some merit, they are a very narrow view of applications. There is a lot to consider when deciding to go web or native, and HTML5 only has answers to some. Also, web-based apps by definition have to target the lowest common denominator. So do you code your app to be able to run on a device with a 500 MHz processor with no dedicated graphics hardware and a tiny amount of addressable RAM when most users will be using a quad-core CPU, 16GB of RAM and powerful dedicated graphics hardware, just so you can tick a supported platform box? It would make more sense in some situations to create native apps for each platform that can take advantage of the platform's strengths and work around its weaknesses. Also, currently Canvas has its problems. Canvas has a powerful API, but it's an abstraction of the lower-level APIs and nowhere near as efficient. Have you tried using Canvas yet? A trivial thing such as drawing a graph can lock up your iOS or Android device and take over a minute to render. You get much, much better performance by using native graphics APIs. You'll never get a powerful graphics app to run well in a web app on constrained devices, so why limit yourself? Another example of a bad choice (currently) for a web app is an IDE. Have you ever tried to use an IDE web app? Every one I've tried sucks. We're still years away from having something as usable as current IDEs in the browser. I think the most important thing to remember with web-based applications is that we're still very early. We're still learning the pros and cons of web-based software and it is in no way a silver bullet, and probably never will be. I think the most important things to think about when deciding whether to go native or cloud are:
1. Infrastructure: do you even have the ability to host/pay for a web-based app? Is it going to cost you more than distributing a native app? Can you afford to be popular? Do you have enough storage for user-generated content?
2. Target audience and usage scenarios: Will all your customers use the application in the same way on all platforms? Will your customers need to be offline more often than online? Will they share data between devices and applications or will it only live on 1 machine? Can you make an interface that makes sense using both touch and keyboard/mouse?
3. File storage & privacy/ownership: Will your users be producing files? Will these files need to be available to other applications? Will your customers trust you with these files? Can you keep these files safe?
4. Prior knowledge: Do you know how to do it? Will it take you a significant amount of time to figure out how to implement this on the web? Will you be able to get a return on this investment?
The debate over thin client or thick client has been going back and forth since the early days of computing. Currently we are swinging towards thin clients but in 5-10 years we will be back to thick clients. The reason for the constant shift in popularity between the 2 styles is that each has its advantages and disadvantages, and the current popular choice is based on recent advances in one or the other. Thick clients will always be around, and so will thin clients. The best advice when choosing is always where the current state of the art is, and whether it satisfies your needs and the needs of your users without unacceptable compromises. And just for clarification, I am a web developer, and have been for a decade. Web capabilities have certainly grown over the past few years and can replace many traditional native apps, but it's not the right answer for many problem domains, and probably never will be.

ketlux

"Do I need to write a native application? That???s a good question!" No it isn't, the quesiton is "Why do I need to change my development environment?" You don't seem to answer that question.

sysop-dr

A web app is pretty much second fiddle if the target audience is confined within the network you are working on. For applications that do much math (scientific at the least), handle large amounts of cash, or are the internal pieces that encapsulate your company's IP, a web app can never be used. Ever! You cannot do the math needed for most scientific apps in anything but FORTRAN and hope to get it working well, especially if the developer is not a professional programmer. These chemists and physicists know FORTRAN and it's just not web friendly. Besides, in most cases the formula is proprietary or a secret, and so putting it into your JavaScript on a web app would just be nuts. Cash: it's OK to have your users risk their life savings on use of a web app, but do that for your company and you are on your way to regulator hearings. Don't do it. And while so many people think that the only things done on their computers are surfing and email, most computing is still either big companies doing the books OR industrial process, and neither of those should ever be on the web. If a look at security says hey, this might be an issue, then it should never be put on the web. I really hope I never have to point this out to people but apparently I do all the time. Security must trump ease of use if the security classification of the people involved requires at least a police records check. That doesn't mean you can't make it a web app, but it does mean that you must ensure the security of the data and access to the app. Force secure protocols, force logins, use secure site certificates and have your software check that the certificate used is yours, and lock out the user after three failed logins. Do double checks in the software for man-in-the-middle attacks (yes, you can do this). Use secure services and have the security of the software checked by an independent team; think security at all levels of development, from writing the requirements to maintaining the system. Never deploy a product for your users that you cannot get standards-based QA on from the vendor. Be careful out there.
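
As a rough illustration of the "lock out after three failed logins" rule, here is a minimal sketch; the function names are invented, and the counters live in memory only for the example, where a real system would persist them server-side:

    // Track failed login attempts per user and lock the account after three.
    // In-memory storage is for illustration only; persist this in a real app.
    var failedAttempts = {};
    var MAX_FAILURES = 3;

    function recordLoginResult(userId, succeeded) {
        if (succeeded) {
            delete failedAttempts[userId];    // reset the counter on success
            return "ok";
        }
        failedAttempts[userId] = (failedAttempts[userId] || 0) + 1;
        if (failedAttempts[userId] >= MAX_FAILURES) {
            return "locked";                  // refuse further login attempts
        }
        return "failed";
    }

    function isLockedOut(userId) {
        return (failedAttempts[userId] || 0) >= MAX_FAILURES;
    }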

Slayer_

As far as I know, browsers are not able to write anything more than cookies. And what about the fact that an alarming number of people are on dial-up or have very expensive cell phone plans? We should be making websites smaller, not bigger.

belli_bettens

To pick up on your MVVM example: imagine the dumb view being your HTML5 app and the model being placed somewhere on a server that you communicate with via a (secured) web service. Isn't that the exact thing you need? And it's already out there! :-)

yuri.paez

My comment was along the same lines as yours... why make a change for a supposed standard when HTML has never been a real standard in the past? Is today going to be different? Who is behind these "standards"? The same big guys that have their own platforms and their own browsers (Microsoft, Google, Apple), and each of them makes their own implementation of HTML. Do they really have an incentive to follow a "standard"? I think it is better to accept that there is not going to be a silver bullet. I'm angry too, because I think that all of this HTML5 is a scam from the big guys and is going to be the biggest waste of time for us developers. My advice: use the tools that you have proven to work today for real apps.

apotheon

Damn, that sucks. I'm rather annoyed by this state of affairs myself, mostly because we really really need something more open and standardized and free of vendor lock-in -- but it would be awfully nice if it friggin' worked. Well, shucks. It looks like we're just as screwed with the open tech as with the closed.

Justin James

Previous versions of HTML were bad standards because they were vague and poorly defined, and it was possible to implement "to spec" and be entirely different from someone else's implementation which is also "to spec". HTML5, though, is a much more bulletproof spec: the vagueness is gone, and part of the spec being considered "finished" is the creation of test suites to verify compliance. J.Ja

Justin James

I certainly didn't. The article was clearly about "applications". I wouldn't dream of writing a serious game like an FPS or MMORPG in HTML5 any time soon. A MUD-like game? Sure. Farmville? Yes. J.Ja

Slayer_

How can they be coded into HTML5? Let's see Epic Battle Fantasy 3 coded in HTML5.

Justin James

Like the Apple Newton, Java Desktop suffered from being the right concept long before the hardware was ready. If we had had mobile and thin clients with the horsepower that typical smartphones or tablets have today, at those price points, the Java Desktop would have worked. Oh wait, the Java Desktop DID become a success! It's called "Android". J.Ja

belli_bettens

1. An HTML5 app is not necessarily the same as an online app. 2. An HTML5 app can detect the user agent (browser) and provide different UIs/functionality. Additionally, it provides offline storage. 3. That is indeed still missing in HTML5 (for security reasons). But the more something becomes desirable, the sooner it will be implemented. 4. Being reluctant to change won't get you any further. That said, I understand your concerns but I think a paradigm shift is at hand. I, for one, welcome the ability to create an app that can be used on any device that is out there. And I admit that HTML5 is not yet at a point where it is usable for any purpose, but it is heading in the right direction. We are now in an environment where people want all their data on all their devices, so I don't think it's wise to limit yourself to one platform. You must admit that it is a great thing to develop once and deploy anywhere. This is an evolution you can't stop; in the end it's the user that decides how he wants to access and use his data. I believe that by stating (as a developer) that you won't follow this change, you are holding yourself back. And we all know what happens to developers/companies that are reluctant to change...

belli_bettens

...so you can reach a bigger audience. Good luck with deploying your native iOS app on an Android device. (So yes, he did answer your question.) Things do not change; we change. Or would you rather still be programming in COBOL?

AnsuGisalas

I thought we were just looking at HTML5 client interfaces for non-local apps... is it inconceivable to have a standard HTML5 app which serves as a bedding for any kind of Fortran program, allowing secured access from the user machines?

Justin James

Parts of the HTML5 spec are underway to allow enough local storage for an offline sync to be useful. This is why Google dropped Gears. J.Ja

terjeb

Of course the Model is on the server; that is what MVC is for. My controller and my model are on the server. My View is in the browser. Sometimes, however, my view is very, very complicated. Think something like a "wizard" interface with 5 steps or more and a plethora of paths through that "wizard". If I run back to the server constantly to get View information, in other words, which user input screen to show next, I have created an application that will not scale. It will also not be anywhere near as useful as a desktop application, since the user cannot leave the app in the middle of the "wizard" and come back at an arbitrary point in the future to complete it. I am less concerned about #2 though. Scalability is my main concern, and complex client applications with server-side state simply do not scale. Don't even think about it. In such cases I need to create an intelligent client app. One that can keep state. One that doesn't have to go back to the server for everything. Then I would shoot myself if I had to do it in JavaScript. CoffeeScript improves the situation. GWT makes it a lot better. We're still nowhere near the ease of development you get with Flex and Silverlight though (both of which have obvious drawbacks, and no, Microsoft killing Silverlight is not one of them, since they are not).
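
As a rough sketch of keeping that wizard state on the client so the user can leave and resume later, assuming localStorage is available (the storage key and the fields are invented for the example):

    // Persist wizard progress locally, so each step doesn't need a server
    // round-trip and the user can come back later. Key and fields are made up.
    var WIZARD_KEY = "signup-wizard-state";

    function saveWizardState(state) {
        localStorage.setItem(WIZARD_KEY, JSON.stringify(state));
    }

    function loadWizardState() {
        var saved = localStorage.getItem(WIZARD_KEY);
        return saved ? JSON.parse(saved) : { step: 1, answers: {} };
    }

    // Example: record an answer, move to the next step, and persist.
    var state = loadWizardState();
    state.answers.companyName = "Acme";
    state.step += 1;
    saveWizardState(state);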

Justin James

I spent a number of years on the HTML5 Working Group, and I had a lot of insight into what was going on. 1) The HTML5 spec is actually specific enough that implementations MUST agree (right now, the majority of the "I'm angry because browser XYZ isn't compliant!" complaints are actually incorrect; the HTML 4 spec was so vague, you could be "compliant" while doing things totally differently from another "compliant" browser... and the "test suites" like ACID aren't provably in agreement with the HTML 4 spec either!) and 2) there will be proper test suites from the W3C itself to prove browser compliance. Furthermore, all of the major players have a very deep commitment to HTML5, the standards process, and getting things right. If you think that they have no incentive to do this right, you are mistaken. All of the big players (except for maybe Apple) have a massive commitment to the Web. Microsoft is realizing that their money is going to come from selling servers and development tools, not client apps; they've already retreated on Silverlight because of HTML5. Google is 100% Web. Adobe is slowly pulling back from Flash in favor of HTML5. J.Ja

Justin James

All of the problems discussed are issues with the browsers themselves, and that's the kind of stuff that vendors are working hard to fix. For example, note that some browsers (IE pops to mind, but i suspect Chrome may be there too) are already doing hardware acceleration of other things, so it is likely that it's happening in IE9 on WP7 (I know, not exactly a big market there)... no reason why Chrome on Android couldn't do it either, if Chrome on the desktop already is. These are the kinds of things which provide critical differentiation both for browsers and devices (will Windows 8 in "Metro" mode even work with a non-IE browser?!?! I need to find out!), and as a result I'd expect to see lots of improvements. J.Ja

seanferd

Maybe you should define what you mean by "applications". Then I can retract any comments I've made about sweeping generalizations, and a lot of other stuff which, while still true, would not apply to your article.

dogknees

It's a pretty fine distinction to make. What other software would not come under the title "application"? Particularly when there are business focussed "games" that are used to train staff.

figgles

I'm sure one could, but there is sunk cost. Your Flash code base is already done, why bother rewriting it? Just because you can, doesn't mean you should. Besides, who cares. 2D games are peanuts in complexity to games like Skyrim, which by the way, will never be HTML 5.

terjeb

>> Or would you rather still be programming in COBOL? It kinda beats JavaScript

apotheon

. . . but, honestly, screwing around with pretty interfaces is kinda beyond the scope of such programming in most cases, I think.

Slayer_

Then any application can start storing executable code for any number of hacks. Creepy.

xross

That's good to hear. I don't have any insight into the current process, but from the outside the current state of things like offline storage and DB looked a little unfortunate. I thought it was a typical case of standards wars. Any insight as to why many companies have become so anti-Flash? I totally get why they want to remove Flash overall, but it seems like vendors are looking to yank it (like no Metro Flash capability) before the replacements in HTML5 are standardized and rolled out. Not to mention, while I am a fan of Javascript overall, there are many, many things that are not present for it to be a true *application* framework. jQuery is great for progressive enhancement, but we're talking about building full apps here. BackboneJS helps but it is certainly not a standard. For example, CSS layout doesn't have the variety of layouts most UI frameworks such as Swing, SWT, .Net, etc. do. Fine for a web *page*, not ideal for a web *app*. And there isn't anything that I've seen in the works. Also, graphically HTML5 is still short on things like a display list, which Flash has. This isn't a "nice to have" but a basic for real graphic development. Yes, there are individual, one-off 3rd party frameworks to address this. But Javascript, with its global namespace, can become a monster when you pull in other folks' frameworks. Regards, Alex

Justin James

... where I lacked it with HTML 4, because of what I saw on the HTML Working Group. All parties involved are working VERY hard to make a "common ground" standard. The standard is extremely provable and can be tested against. That is key for seeing enough compliance for things to really work well. Will it take some time for the implementations to catch up? Yes. HTML5 is the primary motivator behind the accelerated Firefox and Chrome releases. And yes, things change so much, but IE is only now being moved up to a yearly release cycle, so it holds stuff back (on the other hand, IE doesn't release an HTML5 feature until it's pretty locked in stone, which isn't a bad thing). All the major players are extremely... EXTREMELY committed to the standards process and to the standard itself. The way the spec is being written, no one is implementing a feature out of spec and having it released long enough to make their commitment to a bad implementation permanent. My only real complaint is that I think that the editor working for Google has let Google have far too much input into it. Not that he's excluding anyone else, but if you pay close attention to the process, the "prefer working code" mentality meant that Google's engineers could shove stuff in his face and say, "hey look, working code!", dump it onto some Git repo somewhere, and that would be enough juice to get what they did into the standard instead of someone else who may have been working more methodically and cautiously to do the best thing possible. Basically, Google hijacked the process in a lot of ways, but at the same time, it also moved it forward. J.Ja

apotheon

That's sorta my point: even though it may be the tech we need, in principle, it's awful in practice. We're screwed either way.

Justin James

... says, "... it is very difficult for me to recommend building native applications for typical consumer or line-of-business applications going forward." It probably should have been made more obvious in the article, though, for clarity. J.Ja

apotheon

JavaScript gives you a Zippo, but makes you wear mittens while trying to use it. COBOL gives you a pair of sticks to make fire, but requires you to use a Rube Goldberg device with the sticks at the other end, and forbids you to touch the sticks directly even when trying to reposition the sticks in the device.

Sterling chip Camden

... similarly to how an Ogiek beats most first-world inhabitants: he can make fire with nothing more than two pieces of wood.

apotheon

Even when working on end user apps, I generally develop the core functionality as a library, and create an interface (or two) that uses it. My reference to library APIs, though, was specifically aimed at cases where "screwing around with pretty interfaces is kinda beyond the scope of such programming". . . . and I'm not sure where I'm getting downvotes for suggesting there are cases where pretty UIs are not the primary purpose of software development. What's up with that?

dogknees

I don't generally develop libraries and the like. I develop end user applications.

apotheon

Since when do things like library APIs need user interfaces for common usage?

dogknees

The interface is often the most complex part of development. If programmers don't create the interfaces, who does? I don't mean "design" the interface, but create it. ie Write the code that makes it work as designed.

dogknees

Fine, it no longer has the original meaning. So, what word do you suggest we use for that which hacker used to mean? That's the problem, with the word goes the concept and people forget it exists.

apotheon

That's pretty much what I recommended.

Justin James

... I almost never use "hacker" in this sense anyways, and I probably wouldn't have if the person I was replying to hadn't used the word "hack" (or something along those lines). I never liked "cracker" or "hacker" since they are essentially devoid of intent for the most part. In my Patch Tuesday piece, for example, I *never* use the word "hacker", I keep things obvious with words like "attacker" and "malicious user" and other such terms that correctly convey the intent. After all, if someone is a legitimate security researcher, to most folks what they do is "hacking" (or in tech circles, "cracking"), even though we applaud their actions. So... I'll just stick with the words I typically use anyways. J.Ja

apotheon

. . . unless you just chose to ignore them as inconvenient. Nowhere in this response do you actually address any of those points or behave in any way as though you have considered them and accounted for them. Every one of my points stands unaddressed, without any disputation of them -- they are only dismissed without comment. Despite that, I will continue to substantively dispute your points. Logically, this results in me "winning" in any judgment of valid debate, but of course that's completely without practical meaning in a case like this, where I'm trying to counter willful ignorance, and your refusal to either read/understand or just acknowledge the points I make is a continuation of that willful ignorance, resulting in me "losing" in the pursuit of my actual goal. Without further ado, my disputations: "if I'm talking to someone like you, if I use the "common usage" of "hacker" you know what I'm talking about anyways" . . . sometimes. Sometimes, I do not, at first. It is only when you say something that is very distinctly in contradiction with the meaning of "hacker" that I must necessarily realize you are abusing the term that way. In fact, part of the reason that I know what you actually mean as often as I do is that I know something about what you know; if you were a complete stranger to me such that I would have no idea what you really think about any of this stuff -- as is the case with many of your readers, including those readers who know what "hacker" actually means in proper usage -- there would be more common misunderstandings. Even aside from that, this goes beyond you. The more you (like anyone else who does so) choose to abuse the term, the more you effectively encourage others to do the same, including those who might say things that are very ambiguous or judgmentally indistinct such that they lead to confusion. "if I'm talking to a non-tech person, and say 'cracker', at best they don't know what I'm talking about, and at worst they think I'm tossing around ethnic slurs." This is why I use a complete, denotatively obvious term: "security cracker". Who the f[...] do you think would fail to understand that, and mistake the meaning for an ethnic slur? The term "security cracker" (or, more explicitly when warranted, "malicious security cracker") is blatantly descriptive, and does not conflict with any other usage that has a different meaning. It's disambiguation, rather than conflation -- and, no matter how much you protest to the contrary, conflation is exactly what you're doing when you misuse "hacker" to mean "malicious security cracker". "Even in tech circles, 'hacker' is increasingly the common parlance for 'someone trying to get through security'." Yeah, it's increasing -- because of willfully ignorant facilitators of the confusing circumstances of such use such as you. Good f[...]ing job. "If I can't solve the problem all by myself, I'll become part of the problem." It's like trying to vote for the "lesser evil". When you vote for evil, whether lesser or greater, all you get is evil. "As a result, I stopped using 'hacker' in the 'proper' sense a long time ago." I don't use it in that sense, either, except when it is obvious what I mean. That's because I prefer to be unambiguous. 
What I don't do is misuse it at all -- because even when misusing it in private with someone who only understands the misuse, I'm reinforcing erroneous usage so that other people will then go on to use it in circumstances where insult or confusion might result, and I sure has hell don't abuse the term like that in public fora where I'm broadcasting to a mixed audience, resulting in confusion, reinforcement of bad habits, potential insult, and digressions into huge off-topic debates over the correct usage of the term. Once again, good job. If I did not think you were potentially salvageable, I would have just commented on it in a way that brooked no argument, pointed out some other writings, and moved on. I'm beginning to think you really are so unsalvageably willfully ignorant on this subject that I should quit wasting my time on you, though. "I *can* use the 'wrong definition' of the word 'hacker' with zero confusion other than some syntactic debate amongst people who know what I'm talking about anyways" . . . then you should damned well do what I do: use a more explicit term for security crackers (i.e., "security cracker"), and avoid using "hacker" or the abbreviated "cracker" except when circumstances very clearly warrant it. Your depiction of the situation as a black-and-white choice between confusing mobs of ignoramuses and confusing smaller numbers of literati is a classic fallacy of a false dilemma. Take the third road: clarity. . . . or not. You can just continue to make excuses for intellectual laziness and willful ignorance until that habit bleeds over into every technical subject you address, outside of the potential for a couple of very narrow fields where you really give a crap about correctness at all, and you ultimately find yourself unable to maintain friendly acquaintance and ongoing exchange of knowledge with people who care about correctness in a broader selection of subject areas -- people like Sterling and me. "Sorry guys." You're obviously not sorry about anything. You choose to join the march toward ambiguity and ignorance, rather than at least having the intellectual interest in standing outside the rush if you refuse to fight it. You could avoid abusing the term and still say things that ignorant readers will still understand, but you obviously don't give a crap, and insist on wasting far more effort coming up with excuses for abusing terms than you would have to spend on being clear and unambiguous to all parties. edit: Yes, that was my downvote, and after this I hope you are quite explicitly aware of why you earned it.

Justin James

... where common usage trumps the "right" meaning and redefines the meaning. I think that in the last few years, "hacker" is one of those words. Indeed, if you look through my posts since forever, it is only in the last few months that I switched to the common usage. It was a deliberate decision, to facilitate communications. You see, if I'm talking to someone like you, if I use the "common usage" of "hacker" you know what I'm talking about anyways. But if I'm talking to a non-tech person, and say "cracker", at best they don't know what I'm talking about, and at worst they think I'm tossing around ethnic slurs. Talk about not facilitating clear communications! Even in tech circles, "hacker" is increasingly the common parlance for "someone trying to get through security". As a result, I stopped using "hacker" in the "proper" sense a long time ago. If anything, THAT is the real miscommunication, because outside of an increasingly small circle, it has a different meaning. So... if I can't use the "proper" meaning of "hacker" without confusing people, if I can't use the "proper" term "cracker" without confusing (or even offending) people, and I *can* use the "wrong definition" of the word "hacker" with zero confusion other than some syntactic debate amongst people who know what I'm talking about anyways... then it is pretty clear to me that the "common usage" of "hacker" is going to trump the insider definition. Sorry guys. J.Ja

Sterling chip Camden

...should use the term "cracker", if for no other reason than to be unambiguous. Perhaps it will catch on. Justin, when your mother calls her big box the CPU, you don't follow suit do you? It's OK to tolerate other people's abuse of terms, but why make the problem worse by participating in it?

apotheon

In rough translation: "I know the word I'm using is wrong. Other people use it incorrectly. Therefore, I will use it incorrectly, even though I could use a different term that is more correct without breaking a sweat -- a term that everyone would understand, would be completely unambiguous (unlike the case of using the wrong word), and would not insult anyone. I don't care about correctness, avoiding ambiguity, and all that other stuff. I just care about using the same term as the cool kids." Screw that. It doesn't make any damned sense. You [b]know better[/b], but refuse to [b]do better[/b] for reasons that [b]don't matter[/b]. Good luck with that dose of willful ignorance. Criminy.

Justin James

... I know the formal difference in our little community. As someone who deals with average people on a daily basis, I know how others use the word. If the world wants to use the word "hacker" differently from the way you want to, there isn't much you can do about it, and I'm certainly not going to make a crusade out of it either. When I help my mother with her computer, I let her call "the big box" a "CPU" or a "hard drive" and move on with my life too. :D J.Ja

Sterling chip Camden

There is a significant community of code geeks who still use the term "hacker" strictly according to the "original" definition. For that community, it still has that meaning and not the other one.

Justin James

... through improper usage. That happens to lots of words, but in this case, folks seem to hold the word "hacker" in some sort of special status because they choose to apply it to themselves with the original definition, and feel maligned because the common usage has changed. J.Ja

apotheon

None of you are even using the term "hacker" correctly.

seanferd

Your service providers. Creepier than a lot of malicious hackers.

Justin James

It's sandboxed off like your cookies. If the hacker can magically get code stashed in that secure location to run, then they are already arbitrarily running code anyways, so it's a moot issue. J.Ja