
The future of security: Will our brains host botnets?

What does the future hold in store for us and how does IT security fit into the picture of things to come? Chad Perrin envisions the security threats of a brave new digital world.

What does the future -- not tomorrow or next year, but much farther into the future -- hold in store for us? How does IT security fit into this picture of things to come?


I do not have a crystal ball, and I have not yet written a program that predicts future events. We all think about the future from time to time, though, and even speculate about what might yet arise in that future -- especially those of us with an interest in technology.

When we think about the future, though, we usually keep our science-fictional speculations neatly divided from our career planning. Will we be able to send humans to other solar systems in fifty years? What about Mars? Maybe so, and it might be an exciting time for the human race, but it is not common for us to start looking into a career path that is intended to put us in the middle of planning a manned mission to Mars or to Alpha Centauri. Instead, when we think about our careers and skills development, we tend to think about things like attending the Black Hat conference or learning Python.

In the last few years, though, I have been thinking more and more about how my professional skills might be guided toward a future with flying cars and washing machines that communicate with me via brain implants. In fact, such speculations are not even strictly career-oriented in nature.

It occurs to me that, in at least one conception of a future that I think is very likely, secure software development might actually become a survival skill. Somewhere down the line, we are going to start seeing cybernetic implants on the market. The first such thing available for general use might be a direct interface between the brain and the computer. The way things are going in the cellphone world, the computer might even be the implant, with an always-on wireless Internet connection.

Considering the security landscape of today's Internet, though, and the direction security on the Internet seems to be going, that could be a very scary thought to consider. What kind of damage can a computer virus do if it infects a cybernetic implant? Will denial of service attacks only affect our Internet connections, or might they find a way to affect the cerebral cortex as well? Will our brains become nodes in the world's largest botnet?

What about the day that may come when we get implants that allow us to adjust our metabolisms, that can enhance or diminish sensory input on command, or that simply repair replication errors in our cells? What will a computer virus that infects the systems that can do such things actually do to us? Will we one day manage to solve all biological illness problems only to simultaneously open ourselves up to man-made digital infections that can be even more deadly?

As we approach the day when we may find ourselves incorporating computers into our very bodies, people will hopefully become more careful about what kind of software and hardware systems they buy. Today, those of us who worry about Chinese hardware and Microsoft software that could spy on our email are regarded by many as paranoid. In twenty years, it may be perfectly normal to worry about Chinese hardware and Microsoft software that could spy on our most private moments -- perhaps even on our thoughts.

I, for one, hope that open source software is at the forefront of software development for cybernetic systems, for security reasons. I also want to be able to safely and securely tweak the code myself. I am beginning to think it is time to start learning languages like Ada, Go, and Io for secure, massively concurrent development, which may become incredibly important as nanotechnology starts giving us a view into the highly complex future of programmable selves.
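Go, for one, bakes concurrency into the language itself: goroutines communicate over channels rather than sharing memory, which rules out whole classes of data races by construction. Here is a minimal sketch of that style (the `worker`/`jobs` names are mine, purely illustrative):

```go
package main

import "fmt"

// worker reads jobs from a channel and sends results back.
// No shared mutable state: all communication goes through channels,
// which is one reason message-passing languages appeal for secure,
// massively concurrent code.
func worker(jobs <-chan int, results chan<- int) {
	for j := range jobs {
		results <- j * j
	}
}

func main() {
	jobs := make(chan int, 5)
	results := make(chan int, 5)

	// Three concurrent workers draining one job queue.
	for w := 0; w < 3; w++ {
		go worker(jobs, results)
	}
	for i := 1; i <= 5; i++ {
		jobs <- i
	}
	close(jobs)

	sum := 0
	for i := 0; i < 5; i++ {
		sum += <-results
	}
	fmt.Println(sum) // 1+4+9+16+25 = 55
}
```

The point is not the arithmetic, of course, but that the workers never touch each other's data; scale the worker count up and the program stays correct without locks.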

How do we plan for the science-fictional future? As technological advancement accelerates with every passing year -- perhaps every day soon -- this question becomes more and more urgent. If things go anywhere near the way I expect, learning security skills may be the most important thing we can do to prepare for the future.

About

Chad Perrin is an IT consultant, developer, and freelance professional writer. He holds both Microsoft and CompTIA certifications and is a graduate of two IT industry trade schools.

136 comments
Slayer_

Is that a medical procedure or a psychiatric one? Imagine all the garbage keyloggers would get, tracking all our thoughts.

qqr

For those interested in some fairly reasonable predictions on the future of computational ability and its integration with humanity this book is worth reading.

JCitizen

and just by title, that is rare for me!

apotheon

That book is on my "to read, eventually" list. I've got a crapload of other books in the queue ahead of it though, such as Free Culture, The Wealth of Networks, The Future of the Internet -- and How to Stop It, and Metaprogramming Ruby. I'm also working on Stephenson's Baroque Cycle, so, y'know, I'll be reading for a little while yet.

qqr

Thanks for sharing some of your reading list, it is always interesting to see what sort of associations are made to particular subjects.

apotheon

That's oversimplifying things a bit. If I have my computer with me, and an Internet connection, there are probably a thousand other things for me to do that are much more pressing. I'll definitely keep it in mind, though, in case I get a breather.

AnsuGisalas

So, you can read it the next time you're stuck somewhere with nothing but your computer and an internet connection.

apotheon

I haven't read any St. Augustine, but I'd like to, just as soon as I find the time. There's so much in the world to read, though. . . .

apotheon

More to the point, Santee, your vessel ain't any less leaky than mine -- and I still get to have taste. I don't give people a free pass for anything just because they (debatably) lived thousands of years before me.

santeewelding

You are. Rather more or less expertly so. Rather than take you head-on (much work), I'll content myself with taking hold of one Scotch pennant and another, pulling until you can't but notice how you come apart.

AnsuGisalas

It precludes the awareness of a potential end to the state, as the tempest state precludes all other means of vision than what it itself provides. The eye of the storm, while clear, sees only within its domain. Chaos is my environment, and I make do with it as I may. It's a good enough place to be, barring alternatives. An end to this chaos is as inconceivable as it is certain; it's my singularity. So, imperfect is the whole of what is, and nothing but what is. Perfect is for those who have already ended.

santeewelding

But incomplete. The "cannot" remains unmodified, for example, by, "yet". That is, unless you have come full-stop, and you make finally do of what you think [i]is[/i].

AnsuGisalas

I've had what everyone has. A lifetime. Sadly, I cannot engineer insight, I can only intuit. It's a shortcoming, I know. I am largely unable to think things through, because the answers come to me before I can crunch my way there. Short-circuit visionary; deductive reason is beyond me, I have only abductive and inductive. In short, me always beats I, and all myself can do is watch. :P

JCitizen

Faith generally doesn't make sense to the rational mind. I'm not criticizing Chad for good taste in writing; I'm just stating why I like the way the Bible is written. If it were re-written for "entertainment value," I probably never would have come back to the faith. As a former unbeliever, I am especially cognizant of such things. The Dead Sea Scrolls verify that the new versions are accurate to very fine detail from originals. Some of the animal skins they were written on are from widely dispersed areas and eras. Even when the Hebrew (and other local dialects) language changed over time, the translation to the new dialects came through with great accuracy. Until the scrolls were given up for scientific anthropological and archaeological study, I didn't think much of them; but examination by universities that had no axe to grind found the same granular accuracy.

santeewelding

Venerable as death itself; death, which goes way back. So are you. It's that insight you have yet to engineer.

AnsuGisalas

Somehow it's not nice to think that the Bible has blood on it like that. Maybe that's why bloodshed has haunted it ever since? Although I'd like to think that has more to do with the Book of Revelations. BTW: Santee, I'd heard you were venerable, but that sure takes the cake!

AnsuGisalas

From cybernetics to prophets in less than 200 posts... wow! @Apotheon: One thing has to be considered still: Genesis and Exodus come to you through translation. It is hard to judge whether that translation has focused on things other than literary qualities without comparing it to the various "originals," that is, its various sources, none of which, sadly, can be said to be a true original. BTW, did you know that in the Babylonian version of the story of Noah, the boat is in fact a huge cube? That's why it's called an ark; it's a big box, like the Ark of the Covenant (which, by the way, is in Ethiopia!).

AnsuGisalas

Sorry if I was being confusing/confused. There are no apocryphal OT sources. I was talking about that same John of the Revelations. Living in a cave, getting talked to about the antichrist and Satan's synagogue and other until-then-unheard-of stuff, which, when put into the religious context of that time, makes him stand out as a total loonie. Most of his other texts are in the book of apocrypha, the result of a big early-Christian political convention in which compromises were made about which of the huge amount of post-Jesus texts should be canonized into the real Bible, and which should be dumped into the X-files of the Bible (apocrypha) for the enjoyment of the hardcore fringe. Must have been quite a scene.

apotheon

Hah. Nice one. Okay . . . I'm pretty sure you got my reference, but I figure I'll be explicit just to cover all the bases. The T. in my use of John T. Divine is "the", as in "Saint John the Divine", as in "The Revelation of Saint John the Divine", also known as "The Book of Revelations". I now return you to your regularly scheduled subtleties. I've never read any of Moses' non-mainstream work, and have never really felt a burning need to do so. As I've hinted elsewhere in this discussion, though, there are apparently reasons to believe not everything attributed to Moses was actually written by him, much the same way that some books written by other Scientologists (perhaps even after Hubbard's death) have been published under L. Ron Hubbard's name, thus accounting for a significant percentage of Hubbard's supposed body of works.

apotheon

1. Ansugeisler makes some good points about the sketchiness of the chronicle that has survived until our time.

2. Saying I am in no position to pan Moses' literary stylings because his writings came to us via the Gutenberg Bible, which served as the foundational work for mass publication, is a bit like saying I can't tell people that the Caesar cipher is really bad cryptography just because it preceded actually good cryptographic algorithms. I guess I should tell people that the Caesar cipher is every bit as good as Rijndael, then, according to Santeewelding's argument.

3. Other books of the Bible are better written than Moses' contributions, so it's not like I'm saying the whole Bible is that bad. I'm just commenting on Moses in particular.

4. Even earlier works than Moses' sported much better story structure, narrative form, and other features of good writing than Genesis and Exodus, so Moses actually represents a step down from previous works. For instance, the Epic of Gilgamesh is a far better constructed tale than Moses', and if we posit the believability of all the supernatural goings-on in both for the sake of argument, the actual presentation in Gilgamesh lends itself more to a feeling of believability than that in Genesis and Exodus.

Don't tell me what I'm allowed to claim for my tastes in literature, Santeewelding. The best you'll manage in an argument about that is a claim that you're allowed to have different tastes and opinions. Your half-baked attempt at showing me I'm wrong to suggest Moses wasn't a very good storyteller, to judge by what has survived the ages, doesn't hold much water. . . . and I haven't even touched on the fact that Moses is probably a bit like L. Ron Hubbard, in that it's likely that some of "his" writing in the Bible was intentionally misattributed material written by someone who followed in his wake.
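To make the cryptography point concrete: the Caesar cipher has only 25 usable keys, so an attacker doesn't even need cleverness; trying every shift takes a fraction of a second. A quick Go sketch (the `caesar` helper and the sample plaintext are my own, just for illustration):

```go
package main

import (
	"fmt"
	"strings"
)

// caesar shifts every letter forward by k positions, wrapping at 'z'/'Z'.
// k=3 is the classical cipher attributed to Julius Caesar.
func caesar(s string, k int) string {
	return strings.Map(func(r rune) rune {
		switch {
		case r >= 'a' && r <= 'z':
			return 'a' + (r-'a'+rune(k))%26
		case r >= 'A' && r <= 'Z':
			return 'A' + (r-'A'+rune(k))%26
		}
		return r // leave spaces and punctuation alone
	}, s)
}

func main() {
	ct := caesar("attack at dawn", 3)
	fmt.Println(ct) // "dwwdfn dw gdzq"

	// The entire "attack": decrypt under all 25 nontrivial shifts
	// and eyeball (or score) the output for readable English.
	for k := 1; k < 26; k++ {
		fmt.Println(k, caesar(ct, 26-k))
	}
}
```

Compare that 25-key space with Rijndael's 2^128 (or larger) key space, and the absurdity of treating the two as equivalent speaks for itself.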

AnsuGisalas

Oh, so that's not what the T stands for? Anyway, I must say that not much good can be said for a guy who invented satanism, leviathan and the antichrist (right tripping), and didn't even manage to make it a good horror story. It's like he never took a writing class; show it, don't tell it, man! Seriously, that guy has one of the worst cases of being a false prophet I've ever seen. Most of his writings are in apocryphicals for that same reason, so if you like his mainstream work, maybe you'll like his more underground/hardcore stuff too?

AnsuGisalas

Who's to say that the tradition after Moses is the same as the oral tradition was before? It's a ripe time for an edit, when writing is introduced. There are lots of unanswered questions there. Like, how'd the Abrahamites come to pick up Bhaal-worship in Egypt, when the Egyptians didn't worship Bhaal? And, specifically, because Bhaal was a deity native to the Mesopotamian homeland of the Abrahamites (Abraham, or Abram as he was known then, went out from Ur; the -h- was a gift from God). So, Moses eradicated the worship of Bhaal, but most likely it was a worship more original to the Abrahamites than that of the God of Moses. And then, conveniently, he took them on a forty-year trek through a desert that isn't really that big... to make sure they forgot their backgrounds? Moses is an Egyptian name, as in Ram(o)ses, Thothmoses, etc. It means son-of, but apparently Moses' divine father was unnamable. That, of the Egyptian culture of that time, fits only Aton. Maybe Freud was wrong after all, and it was the other way around?

santeewelding

You are a Paleface, as I recall. You did not learn how to read and write from Martians. You learned how from a long line of progenitors who, in turn, began with those who at last were able to read the Bible when it first hit the masses with the advent of printing. If so, you are in no flucking position to pan the genesis of your considerable skills.

apotheon

It's like saying that because Avatar had a hackneyed, derivative plot with cardboard cutout characters it must be true.

JCitizen

is what makes it believable to me. If he were a good storyteller, I would be more inclined to think it was just a story. It was obviously written by someone who really didn't want the distinction in the first place, but was under orders to perform anyway. Anything put together before the flight from Egypt was hand-me-down oral tradition to begin with.

apotheon

Reading between the lines doesn't change the fact that, apparently, Moses was a terrible writer.

santeewelding

On how you read: the lines, or the lines and between the lines.

apotheon

Do you mean the book by Moses? I read that, and its sequel (Exodus). It wasn't a very good read, actually -- though it did have its high points. I thought Revelations (by John T. Divine) was a much more interesting read, overall.

santeewelding

Does not in any way whatsoever comport with existence. If it did, we are screwed. If you do, you are screwed. Which explains your report.

qqr

So happy I am filled to overflowing with joy and enthusiasm at the very thought of human existence.

jstinnett

If you consider the way Rupert Murdoch's Fox "news" has altered the meaning of the word "fact", I suppose we are not far from Chad's Brave New World. In fact, it may not be necessary to actually implant a device into the body. Take the HAARP project and the mass-media brainwashing of the American public by media conglomerates. Read what Dr. Nick Begich says about HAARP at http://www.haarp.net/ Ok, gotta go, my Alien Orders are downloading....

seanferd

I've been waiting for a comment on this from him, as the post title would seemingly play right into his hands. What with nanotech, electronic implants, and the Singularity all coming soon to a theatre near you. (Oh, and the robot sex slaves. In the future, there will be robots!)

apotheon

Somehow, I'm put in mind of a "sexaroid" from the Bubble Gum Crisis anime series of the '90s -- specifically, Sylvie. There it is. Now you know a bit more about my geeky past.

JCitizen

that is the one I watched. That one shows my age. I may have to rent a few of these, if Netflix has them in stock!

JCitizen

a kid anymore. I'm sure I would find it revolting; other than mild curiosity at history. Anime was something very different for a child of the '60s.

apotheon

I've never really found Astro Boy very interesting.

boxfiddler

they're filthy steeenkeen rich robot sex slaves, I'll take three. etu

sboverie

If we think of a botnet as having been infected with a piece of information that causes the computer to spam or attack other computers, then we can see similar actions caused by bad ideas in people. Bad ideas like antisemitism and prejudice are good examples of ideas turning people into the equivalent of a botnet. The concept is called a "meme," and it is sometimes a good thing and sometimes evil. The Jonestown tragedy is an example of a meme that drove people to suicide.

apotheon

A meme is more like a gene (thus the name), and when Richard Dawkins originally coined the term it was in the context of an explanation of evolution. He postulated a direct correspondence between the evolutionary survival characteristics of genes and memes. Infection is not the primary vector for evolutionary propagation in and of itself, even if it is a key component of the survival characteristics of certain types of genes. Propagation occurs by way of inheritance, and the same is effectively true of memes as well. One simply does not get "infected" by a meme through exposure to some kind of post-incubation symptoms of another infected person. Rather, memes are passed down as a form of inherited trait or consciously adopted. Botnets are even further outside the realm of evolutionary analogy -- and, thus, outside the realm of memetics. A botnet is basically just a widespread infection that allows automated, conscious direction of very specific actions in a large number of infected nodes in the net. While memes can make their carriers more susceptible to influence by those who are well-placed within the meme's paradigm perspective, they do not turn their carriers into directly mass-controlled automatons.

kama410

I've been thinking about this question for years. I once read a series (sci/fantasy) where this sort of thing was relatively commonplace. I thought it was even more 'future history accurate' than anything I've read by William Gibson. Sadly, I can't remember any of the titles of the books or the author. The Singularity is approaching. Assuming our 'leaders' don't destroy our species through their hubris before it gets here.

boxfiddler

a [b]huge[/b] assumption. :|

kama410

Honestly, I don't see how it can be otherwise. The definition of the Singularity is a point in the future after which we, as a species, are so radically altered that we cannot predict the consequences of the change. (Yes, that's pretty rough and inexact.) Do you think you can grasp the consequences of having near-instant access to any information that is available on the internet, or on a server you have access to? Can you imagine the consequences of having software that will provide you with a means of interpreting seemingly unrelated information in almost real time? How about an expert system that interprets the facial expressions of the person you are talking to, so you can gauge their reactions to what you say and determine if they believe what they are saying? This is just a tiny bit of imagining of what could be possible when your interface is no longer KVM, but a direct link to your senses. Do you think that it will never happen? How am I making a huge assumption here?

JCitizen

I was talking about earlier. Artificial sight has already been accomplished using this technique. The subject sees a very low resolution image as points of light. The sensory manager collates the image, because each path in everyone's brain is different as you already know. They were able to calibrate this image into something the subject could see; I assume by feedback from the subject of the field study. I find your problem with folks who never had sight fascinating, but I suspect even blind from birth folks have some kind of imagination about imagery; it would just be a different part of the brain, I suppose. However, I also imagine stimulation of the visual cortex would still get a reaction of perceived sight of some kind. Besides freaking out the subject of such a field study, I should think headaches would be a common side effect at first.

AnsuGisalas

My point is that you can't feed any data straight into the "mind," because the mind isn't centralized and has no trained-from-birth I/O point (unless the implant has been done at a very early age, and suitable training has been given). So, you'd have to use already-known inputs, like vision or hearing, which puts the datastream into that secondary position, with associated slowdowns. One thing I thought of, though, which could be done right now: if someone has lost an appendage (even just a finger) to an accident, then that nerverail can be used as an I/O for a simple same-type interface; a mouse for example, or a controller for a virtual appendage. If an art painter loses an arm, it should be possible to fit him with a virtual arm that allows him to paint on virtual canvases... that'd give digital painting of an unseen quality if the interface is good enough. Similarly, a pianist could play remote gigs, or a carpenter could use a robot cutting tool, etc. That'd be cool. Expensive at first, but it'd be just the kind of visible doing that'd help generate funds for development.

Neon Samurai

I grew up on Shadowrun and Cyberpunk settings, so cyberware has been a constant food for my imagination. In terms of how it will happen, I agree with Apoth in that it will be an interface. If data-jacks become viable, they will simply be a connection port. The computer will connect into the five senses, providing sensory feedback, but the processing of that data will still happen within the squishy meat. Virtual reality will not tell us what we taste, see, smell, touch; it will simply tap into our existing sensory systems behind the meat sensors we rely on now. The catch here is that we have to confirm if the data flows over the wires in the same way it seems to on our electro-chemical wires. If we can emulate the same signal on the wire, then the processing unit (brain/spine) simply accepts it. With things that currently have to tap the brain directly, we see the variance in mental development. The computers sending the sensory data must be trained with the few bionic eye test subjects that I've read about. This is clearly not the better way to do it once we are able to wire directly into the optic nerve and other feeder trunks. (I had a point, but I think it got lost in there somewhere.)

apotheon

So what? This has nothing to do with what I'm talking about. The idea is to offload the linear tedium of simple (but lengthy) processing to outside resources, and to interface with those resources. The idea is not to let the computer do all your thinking for you -- which, as you point out, is not really on our radar yet due to the difficulty of that kind of massive parallelism and heuristic evaluation. On the other hand, as the rate of advancement increases, even producing artificial "thinking" machines that really do approximate human-style cogitation will start looking more and more realistic.

AnsuGisalas

One problem with this is that the brain, much as its physical structure is fractal, also makes decisions fractally. When you're listening to someone speaking, your brain is handling each possible meaning of each word in the series at the same time... producing parallel interpretations of high complexity, so how do you locate that in the brain? And how do you figure out which one is the right one? Also, for actions, do you ever consider how many brainfarts you avert in a second? The brain is constantly going through options, discarding them as its ongoing analysis of outcomes shows them to be against its goals... if it manages. Every once in a while one gets through. That's the thing: the brain is not digital, it doesn't ever answer yes/no. It emulates a specific layer for that, the so-called consciousness. The consciousness does not exist at a single-neuron level, at least not according to any finding I've heard; it's an emergent entity. So how do you find the place to fit a wired-in I/O that's accessible to the consciousness? You'd have to find a neural placement for the consciousness... or make do with a sensory modification with all its drawbacks. Giving the id access to an I/O is NOT wise; just look at Forbidden Planet.

apotheon

How quickly can you turn on your laptop, open the appropriate applications, and start reading something? How much does scrolling slow you down? Now . . . how much more quickly do you think you'd be able to perform the equivalent actions for a direct in-brain connection just by thinking about it -- especially if you can alias multiple-step actions to single-step "commands"? Then, of course, there's the fact that with fiber optic capabilities the physical tech is likely to provide a faster data rate than the electrical impulses of the natural organic nerve connection between brain and ocular instrument. That's not really the bottleneck, though, so it's hardly worth considering as a major factor until we deal with the problem of the speed and usability of the interface (a physical laptop computer).

AnsuGisalas

The consciousness is *SLOOOOOW*. Damn piece of #?%@ bloatware! The subconscious eye is pretty fast, fast enough to allow you to stereo-optically estimate the speed and position of a fastball heading for your head quickly enough for you to catch it... unless you stop to think about it, in which case you'll have a headache in a split-second. The thing is, the subconscious part of the mind is much much much more important than the conscious. The consciousness is like the anchorman, while the subconscious is the whole rest of news network.

Neon Samurai

That would be pretty good backup evidence of some lag in visual processing. Just think of how quickly things begin to blur into invisibility through subtle or speedy movement.

AnsuGisalas

The visual cortex still has response times; the physical eyes, as far as I understand, don't add much. So the data still has to go through a secondary role: sensory representation, perception, interpretation.

apotheon

I take it you haven't been following the discussion in other parts of the thread that touch on things like direct input to the visual cortex.

AnsuGisalas

The singularity came and went. No one could've guessed the outcome of the invention of the transistor, for example. The steam engine, too. It's still not possible to absorb info from an electronic format except as a secondary function to an ocular or similar sensory task. Right now there's a hold-up in writing, but the hold-up in reading and comprehension is almost as big, so writing without writing doesn't help much.

santeewelding

What happens after more than a few stiff drinks?

apotheon

Computers only facilitate getting it into writing. Computers also automate things. The biggest benefits to hooking our brains up to computers will probably be (at least at first):

1. offloading heavy processing tasks to artificial systems designed for the goal of automating tedious work

2. much faster communication between nodes (people) in the network

While a lot of this stuff can be done already, it tends to require a lot of data passing through our fingers, and it tends to require lugging around a bunch of external equipment (keyboards, LCD displays, speakers, power supplies and batteries, pointing devices, et cetera).

But that's SO far away.

Bah, humbug. The Singularity is Near. Every time there's an advancement in the processing speed and power of information technologies, the speed with which we can achieve the next information technology advancement is potentially accelerated. We get to experience at least an exponential growth curve as long as nothing prevents us from leveraging new advancements in the process of making more advancements. If by "far" you mean "wow, ten whole years" or something like that, okay. If you mean a hundred years, though, I think you're missing the bigger picture here.

AnsuGisalas

Either we do, or we don't. Fifty-fifty :p

AnsuGisalas

Computers only facilitate getting it into writing. Ok, so accessing files (writing) somewhere else might be neat, but it's still not in a format that the brain can actually use on a basic level. So, at least just internet hookup is not enough of an event. If they figure out how to make information available to the brain in a useful way without the go-between of reading and understanding... that'd do it, sure enough. But that's SO far away.

kama410

Sorry, I misunderstood. Yes, someone predicted that we had about a 50% chance of survival over the next hundred years. I think that's fairly optimistic, really.

boxfiddler

[i]Assuming our 'leaders' don't destroy our species through their hubris before it gets here.[/i]

apotheon

I think the huge assumption in this case is assuming that our "leaders" don't succeed in destroying the species first. If they can refrain from doing that, the technological Singularity appears to be pretty much unavoidable.
