Like many of you, I saw Star Wars this past month and thought, "Man, it's good to be back among the lightsabers, TIE fighters, droids, and of course, the Millennium Falcon!" It didn't feel anachronistic to return to the Star Wars universe, even though it had the same elements as when I was six years old.
Some old-school technology (both on and off the big screen) will always remain fun and interesting. But other technological elements have worn out their welcome and need a swift kick to the curb. This article looks at 10 examples.
Now, this list is subjective and I can't promise all these things are headed for the dustbin of history. But I hope so. As a system administrator and technology writer, the problem wasn't finding 10 examples. The problem was narrowing the list to only 10.
1: Website frustrations
For a society that is becoming more and more reliant on web browsers, we sure do have our share of hair-yanking experiences dealing with them.
Constant updates. Plug-in woes. Websites that require you to log in to view an article but then dump you on the home page when you comply. Browsers that don't save credentials no matter how many times you tell them to. Cumbersome site registration with illegible captchas and the requirement to wait for an email to complete setting up your account. (This is getting a bit better with the option to log in with social media accounts, I'll admit.) The necessity to record your account ID and password somewhere or else use the same one you use everywhere—which is a BIG no-no. The list goes on.
Oh, and while I'm on the topic of aggravating websites, why is Yahoo still around?
2: Rude online behavior
Yes, I know I admitted that some of the items on this list may be here to stay, but online etiquette needs to get better before we all devolve into a pack of snarling hounds.
In the first place, anonymous comments need to go. I understand they may have some merit when posting information from behind the curtains of oppressive governments, but using a handle like "YerMomDoesMyDishes" to sneer about someone else's hard work is strictly 1990s. Many sites are requiring valid human accounts or eliminating comments altogether thanks to the silliness in the feedback section.
Second, deliberately malicious endeavors intended to thumb your nose at strangers have had their day. Some people were posting spoilers about the latest Star Wars film on Facebook for no purpose other than hoping to ruin it for people. I myself glimpsed three key facts about the film thanks to individuals who thought it would be funny to leak surprises. No reason for that other than sheer immaturity.
And then we have the hoaxes. Always with the hoaxes. Back in the 1990s people were breathlessly forwarding emails insisting Bill Gates would pay you $50 if you did the same—and just a few weeks ago they were posting absurdities on Facebook about how Mark Zuckerberg would give away stock for copying and pasting a chain letter. While well-intentioned, this stuff is always prefaced with something like "Hey, it couldn't hurt to try!" It DOES hurt when people don't research facts and just repost wishful thinking; it wastes time, energy, and computing resources. Snopes.com is a wonderful resource for debunking myths, rumors, and urban legends, and there are many other resources out there just a Google search away.
3: Large companies getting hacked
I'm not going to name names, but if you have billions in assets, you have no excuse for getting hacked. Hacks aren't some inevitable force of nature; they are preventable with the right controls, analysis, and measures in place. I'd like 2016 to herald the end of large corporate behemoths being taken down like Goliath by lapses such as lax security policies, unsecured terminals or devices, or exploited vulnerabilities in outdated software.
Now, 2015 did involve a bit of schadenfreude when it came to some big targets getting poleaxed. I can't say that I feel too bad for Ashley Madison users, who perhaps learned a valuable lesson in their quest to engage in or promote adultery; hackers obtained the customer list and threatened to expose users. But the lesson should be twofold for companies and users alike: We're not at a point of foolproof security yet. Maybe we'll get closer in 2016, but security is more a journey than a destination.
4: Java headaches
There are two kinds of Java: one that increases your stress levels, gives you the jitters, and makes you hopelessly dependent. The other is the hot caffeinated drink we all know and love.
I'm singling out Java because it represents one of the most backward technological problems out there. So many websites and programs rely on Java... but it has to be the right version. There are plenty of "Allow access" hoops to jump through if your browser even thinks you're doing something remotely unsafe, and you usually have to jump through these hoops repeatedly. I once administered some Citrix NetScaler appliances that demanded a specific Java level or else they wouldn't run properly (and which warned that future updates would block access entirely).
Yes, Java has an auto-update process like many other apps, which is supposed to alleviate these issues. But the updater is often kludgy or unreliable and can wreak havoc on programs it previously played nicely with. Worse, older versions aren't removed when Java is updated, and these leftovers invariably become gaping security holes.
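As a rough illustration of the stale-version problem, here is a minimal Python sketch that flags every installed Java runtime older than the newest one found. The version strings and the sample list are hypothetical examples, not the output of any real installer inventory:

```python
# Sketch: flag stale Java installs left behind by the updater.
# Version strings like "1.8.0_66" are assumed examples.

def parse_version(v):
    """Turn a string like '1.8.0_66' into a sortable tuple (1, 8, 0, 66)."""
    main, _, update = v.partition("_")
    parts = [int(p) for p in main.split(".")]
    parts.append(int(update) if update else 0)
    return tuple(parts)

def stale_installs(installed):
    """Return every installed version except the newest one found."""
    newest = max(installed, key=parse_version)
    return sorted(v for v in installed if v != newest)

if __name__ == "__main__":
    found = ["1.7.0_45", "1.8.0_66", "1.8.0_25"]  # hypothetical machine
    print("Newest:", max(found, key=parse_version))
    print("Stale (candidates for removal):", stale_installs(found))
```

In practice you would feed this from whatever inventory your platform provides (the Windows "Programs and Features" list, for instance); the point is simply that leftovers are detectable and should not require a human to hunt them down.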
In short, we are told that billions of devices run Java, as meanwhile there is much drama and hand-wringing about the security problems it presents and the difficulties in getting it to work right.
5: Vendor manipulation
It's long been a mantra of mine that the customer drives the vendor, not the other way around. So why are vendors attempting to herd consumers when it should be the consumers setting the pace?
I'm going to use Microsoft's push to get people onto Windows 10 as an example. It is aggressively promoting its newest OS, even putting an upgrade icon in people's system trays that can't easily be shut off or removed—or actually downloading Windows 10 onto computers without the owner's go-ahead.
I tried Windows 10 and had a "meh" reaction, especially after my VPN client stopped working, and went back to Windows 7. Now, I'm not a wet blanket; it's my job as a technologist to stay on top of current trends and never fear change. But it's also my job to find the tools that work the best for me and define my own methods of productivity. There is nothing lacking in Windows 7 that has me pining for any solutions offered by Windows 10. And yet we are told that Windows 7 is frightfully insecure and we should stop using it. It reminds me of TV commercials from the 1970s in which "new and improved" products were showcased alongside the originals and we were told how poor the original product was by comparison. My reaction, even as a child, was to think: "Then why did you tell us that product was so great a year ago?"
Operating system upgrades were different in the past. Going from Windows NT to 2000 was a must for better stability and resource utilization. Now, the OS is more a matter of choice.
Oh, and one final word on vendor manipulation: Please stop bundling apps like McAfee with other installers, where unwitting users end up with unnecessary software!
6: Removal of choice
This goes hand-in-hand with #5 but deserves to be a separate category. Having choice in a product means the ability to expand or customize it to better suit your needs. Recent smartphones from Samsung have shipped without microSD slots or replaceable batteries (neither of which Apple ever offered). As someone who uses lots of microSD cards and goes camping for days at a time, this rendered the Samsung Galaxy S6 a no-go for me.
Removing choice takes away a lot of the fun for the user. I realize Apple's goal is a complete soup-to-nuts ecosystem that reduces as many variables as possible to create a more predictable and stable environment. However, this can breed frustration: witness the iTunes synchronization woes and the complaint that only the "prescribed" way to get music onto the device is permitted.
Steven Levy's brilliant book "Hackers: Heroes of the Computer Revolution" touts the heady days of exploration and experimentation in the 1950s MIT technology realm. Read this book and you too will get a feel for the sheer excitement of learning how stuff works by being allowed to tinker. Removal of choice means limiting the user. Some users don't mind. Some even WANT a minimal set of options. But for others this is a buzzkill.
7: Lost data
I was approached several times this year by friends who had lost data when their hard drives or operating systems fizzled out and wanted to know if I could recover the information. To which I responded, "Why weren't you using Dropbox?" Any cloud storage solution, such as Google Drive or Box, could also suffice, of course. It's inconceivable to me that something this easy hasn't become universal.
Operating systems, hard drives, and the devices themselves should be seen as finite and disposable, like vehicles on the highway. It's the passengers that they carry—namely, the data—that are valuable and irreplaceable. Let's hope 2016 sees the last of questions like "Hey, do you know how to recover files off a dead hard drive?"
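Even for friends who won't adopt cloud storage, a crude local mirror beats nothing. Here is a minimal Python sketch of the idea—copy any file that is new or changed into a backup folder, skipping everything already up to date. The folder paths are purely illustrative:

```python
import shutil
from pathlib import Path

def mirror(src: Path, dst: Path) -> list:
    """Copy files from src into dst when missing or changed; return what was copied."""
    copied = []
    for f in sorted(src.rglob("*")):
        if not f.is_file():
            continue
        target = dst / f.relative_to(src)
        # Copy when the backup is absent, older, or a different size.
        if (not target.exists()
                or target.stat().st_mtime < f.stat().st_mtime
                or target.stat().st_size != f.stat().st_size):
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)  # copy2 preserves timestamps
            copied.append(str(f.relative_to(src)))
    return copied

if __name__ == "__main__":
    # Illustrative demo using throwaway temp folders.
    import tempfile
    src, dst = Path(tempfile.mkdtemp()), Path(tempfile.mkdtemp())
    (src / "notes.txt").write_text("draft")
    print(mirror(src, dst))  # → ['notes.txt']
```

Point it at your documents folder and an external drive (or a folder a cloud client watches), schedule it, and the "dead hard drive" question answers itself. This is a sketch, not a substitute for a real backup tool with versioning.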
8: Password hoops
Passwords need to die. In this age of fingerprint, facial, and voice recognition (the 2015 film Mission: Impossible – Rogue Nation even featured a gait analyzer that compares how people walk against a known metric to verify their identities!), all the rigmarole associated with creating and changing passwords—never mind the endlessly onerous requirements—should be extinct.
And don't even get me started on the logistics of storing passwords in a central database that must be protected by, you guessed it, yet another password, then made available on any devices on which you might need to use these passwords.
9: Outdated or incompatible technology
Just this morning my wife texted me and said our printer wasn't working. Again. Despite my replacing the toner cartridge the other day. This is hands-down the biggest headache in technology: babysitting cranky printers that are perpetually out of ink. More so even than fax machines or batteries that barely hold a charge, printers should be a thing of the past, what with digital newspaper subscriptions, PDF files, ebooks, and the ubiquity of portable media devices.
However, in my view there are many other candidates for the museum: digital/video cameras, MP3 players, CDs, DVDs, radios, cassette players, calculators, alarm clocks, and vinyl/turntables. Your mobile device can perform many of these functions and more. I even read a news article recently that typewriters were "coming back" and a nearby shop was devoted to refurbishing and selling them.
Now, I realize I stated in #6 that removal of choice was bad. And I myself feel it's a waste of time to haul around audio cassettes and DVDs or—heaven forbid—to type documents on an IBM Selectric like we did in college in 1990. What's different here, in my view, is that this technology represents an anachronism—a step backward in convenience that adds clutter and waste—rather than a step forward with new capabilities.
10: Manual intervention
All too often achieving our computing goals requires manual steps. I sync my smartphone pictures to my Dropbox account, but I still have to remember to go in periodically and delete the images from my Samsung lest it fill the drive. (I do the same for my wife's iPhone, which backs up to iCloud but doesn't need to keep those pictures permanently when they can be stored elsewhere.)
Similarly, at the end of every year I offload my project work into an Archive folder, which I then make sure gets backed up to an external hard drive so I can start the New Year fresh.
Getting my ebooks and music synchronized across my devices can be a headache. I prefer to use local storage, as I often go off the grid on camping trips. Copying data to my Android via a USB cable is horribly slow, much slower than it was even on the BlackBerry I used to use. It times out often, and I have to unplug the cable and plug it back in. I tried a wireless sync program, but it ran into too many issues to serve me reliably.
We need better automation of repetitive processes for the background stuff we shouldn't have to manage. True, there are scheduled tasks, automatic synchronization, and reminder apps for some of this. But overall there is still too much hands-on work to keep our personal technology running smoothly. Even the stuff we can automate often requires manual checks to ensure that it worked properly. Do you really want anything to automatically delete old data without being sure it was properly offloaded or backed up? I realize this question makes the solution even harder but it's a valid point.
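That last caveat—never delete until you've verified the copy—is itself automatable. Here is a hedged Python sketch (the file names in the test of the idea are hypothetical) that removes a local photo only after confirming its backup has an identical SHA-256 hash:

```python
import hashlib
from pathlib import Path

def sha256(path: Path) -> str:
    """Hash a file in chunks so large photos aren't loaded into memory at once."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def prune_if_backed_up(local: Path, backup: Path) -> bool:
    """Delete `local` only when `backup` exists and matches byte-for-byte."""
    if backup.exists() and sha256(local) == sha256(backup):
        local.unlink()
        return True
    return False  # keep the local copy; the backup is missing or differs
```

Run something like this over a phone's synced photo folder from a scheduled task and the manual "go in periodically and delete" chore disappears, while a missing or mismatched backup leaves the original safely in place.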
What would you jettison?
Is there some aspect of the digital age you hope to bid a final good-bye to this year? Share your thoughts—and peeves—in the comments section.
Scott Matteson is a senior systems administrator and freelance technical writer who also performs consulting work for small organizations. He resides in the Greater Boston area with his wife and three children.