Xerox's CISO Alissa Abdullah discusses how innovation in technology and security has changed throughout her career.
CNET and CBS News Senior Producer Dan Patterson sat down with Xerox CISO Alissa Abdullah to discuss how innovation in technology and security has changed throughout her career. The following is an edited transcript of the interview.
Dan Patterson: Your career spans mathematics, politics and now cybersecurity. Before we get to the crazy interesting future that is printers as IoT devices that need to be secured, I'm wondering if you could help us contextualize where your career began and a little bit of your time in Washington, how that informed where you are now?
Alissa Abdullah: Sure, so I am now the Chief Information Security Officer for Xerox, but I started out as a mathematician for an intelligence agency.
I'm a certified cryptologic engineer, and learning the ins and outs of cryptography, and how you link that in with systems, really gave me a perspective on cybersecurity, I think a very different perspective on cybersecurity. But I've stayed in the technology realm all of my career: starting out as a mathematician, moving into information technology, and then moving to cybersecurity. As I began to develop secure systems for intelligence agencies and within the government, I was actually working at the Pentagon when I got the call saying, "We want you to come and join the administration," and I was completely, completely shocked.
But going from government, within the Department of Defense (I was actually with the NSA), then moving to industry, because I was the Deputy Chief Technology Officer for one of Lockheed Martin's business units, and then moving to small business, because that's the contract I was on at the Pentagon, really gave me industry experience. It gave me private sector experience, it gave me public sector experience. All mixed up into one person, with all of these different perspectives and all of these different pieces of influence and information at my hand, that really shaped and formed me for the Executive Office of the President as the Deputy CIO.
Then you take that and bottle that up, jam all of that information in, and now, in the sense of Xerox, it gives you a different perspective on saying no to innovation because of security, saying no to technology because of security, and how you want to present policy, move policy, and actually own the cybersecurity conversation.
SEE: IT leader's guide to the future of artificial intelligence (Tech Pro Research)
Dan Patterson: So I think one of the things that I find most fascinating about your career is that it does kind of span—there's really geeky, hardcore math in cryptography to small business, to politics and to the enterprise. Along with that, you've seen this chain of innovation as well, so I wonder if you could tell me, now that we kind of understand your career trajectory, how has innovation in technology changed and how has security changed as your career has evolved?
Alissa Abdullah: Sure. If I think about where I started really paying attention to technology, it was at an age of societal resistance to technology. If you think about security organizations before, they were the organization of no. They were Dr. No. You ask, you want to do something? No, we're not gonna be able to do that. The most secure system is one that is not open, one that's not connected.
Moving from that, we jumped, I think at a very fast pace, into the consumerization of IT, right? That means IT being driven by consumers: people wanting to have more access, people having that access. Even though businesses were not capitalizing on technology, the consumer population, in their personal lives, was able to capitalize on technology.
That gives us a different feel in trajectory and pace. My career has spanned over 20 years, and it has definitely moved very quickly through each of the phases of technology. Now, when we think about the Internet of Things, when we think about AI and machine learning and the possibility they hold for us, we don't have the same bounds as we used to. We don't have the same resistance as we used to, because I think we've learned that we can't have that amount of resistance.
Dan Patterson: So when you say, "as we used to," put us in a place and time. How have things... it seems like you're saying we've come 180 degrees from that place in time-
Alissa Abdullah: Oh sure.
Dan Patterson: But where were we and what era are we talking about there?
Alissa Abdullah: I think probably, I don't want to age myself, but I think probably in the 90s was when I started to really experience resistance.
I think we wanted to do new things, we were being introduced to... Google Cloud was Google Cloud before it became Google Cloud. We were using webmail in many instances and didn't really have the concept that it was cloud. We were using... fast internet, or probably back then it might have still been dial-up at that point. I just remember that at that time, we allowed innovation to be led not by the consumer, because the consumer was still learning, but businesses were trying to trickle in the thoughts and ideas. Once the consumer picked up on it, then it began to roll very, very quickly, and the acceptance of technology began to roll quickly.
You think about Netflix. Netflix was streaming and doing specific things long before it became cool, long before it became very, very popular. Now they've grown that. They saw something and developed a product before it was even asked for. Same thing with Apple. Apple developed a product before it was even asked for. Who knew, who thought we would have been able to have a phone with a camera on it and our address book at the same time, and all of these different things, in a very, very small modular device? No one could even wrap their minds around that. Now we're really talking about robots being able to take control and drive cars for us, and do a lot of other different things.
When I think about those different phases... I'll give you another experience. When President Obama entered the White House, we had desktops and floppy drives. That was what, 2008, when he first came in? Well, he was elected then. When you think about that, and that was all based on, nothing against any administration, but it is how we were accepting technology at that time. You had a president who came in and said, You know what, I want technology. He's the president who won on big data. He's the president who won on the influence of technology and how to really use technology. So then there was a major shift in the acceptance of technology within the White House. We had a government CIO who really pushed technology, and the closing of data centers, and things like that. I can think of specific points in time where things happened and took a major shift.
Dan Patterson: Tell me one of those points in time. There are a few things you said I want to come back to, especially the resistance, which I assume was resistance to innovation in the enterprise, and then the inflection points, which sound like the mobile device.
Before we talk about that, tell me some of those moments in time you remember as real standouts, like hey, things are changing.
Alissa Abdullah: I'm gonna talk about a time that's like now, where I think things are changing and people are still grappling with the acceptance of it, and that is electric cars. There's a lot of discussion around whether we've gone too far. I've been in some discussions about whether AI is taking over and we're losing our ability to rein it all in. I think those same types of conversations were being had when we started talking about the cloud.
When we started talking about Cloud and having our data out there in the Cloud, some nebulous thing that no one knew what it really meant. There was a lot of-
Dan Patterson: It's just someone else's computer.
Alissa Abdullah: Right. There was a lot of conversation of, is that what we really want to do? We're used to holding our own data, we're used to being able to go into the data center, or onto our computer, and see... I know it's on the C drive and blah blah blah. We can go to those specific places and find it. But now we think, I don't want that. I have more capacity. I have more freedom. I can access my data across all devices if it's sitting in the cloud and not on my computer at home.
I think that same shift is happening now, when we think about the advancement of AI and modern technology, and I bring up electric cars. Not just electric cars, but I have been in cars that do a lot more than you think. The autopilot function is just amazing. I think it is human beings' innate sense of control. We have to give up some of that control, and I've been talking in some technology circles about how, if we give up that control, it may help reduce the shortage of talent. We always talk about the shortage of talent in technology, but it's broader than that. You've probably talked to a lot of other industries, and they think there's a shortage of talent in their industry as well.
I think we need to fine-tune AI that will help us shrink that shortage of talent, smaller and smaller, by using the advancements of AI and machine learning, but that's a whole different conversation. I think these are pivotal points, because I think we're at a pivotal point right now.
Dan Patterson: What is that?
Alissa Abdullah: The pivotal point is, how much of AI are we going to accept? It is our ability to consume it, understand it, and realize that this is happening. That is going to set the pace at which we adopt it.
Dan Patterson: Okay, I have questions for you here. I want to get to security in just a moment, but I think this conversation about machine learning and artificial intelligence is incredibly important because of what you just said. This is somewhat of an inflection point, just like mobile was a decade ago. Particularly when we look at larger conversations around technology, specifically in Silicon Valley, we do see these monocultures that are programming algorithms.
What challenges do we have when it comes to the technology skills gap, and these monocultures programming things, like machine learning, that will influence millions, if not billions, of people?
Alissa Abdullah: I think there's, number one, the technology skills gap. We don't have enough people. Even though we have so many people interested in technology, we don't have enough. And when you say you don't have enough, then you have to scale what you have across all of the different areas of technology.
When I was at PARC, Xerox's research center, one of the engineers there told me it takes a human being three times to learn something. It takes a machine maybe 3 million times to learn.
3 million times to learn that that thing is indeed what it is. With that thought, you think about the technology gap, and the learning, and trying to train people. Though it may take a machine 3 million times to learn something, once it learns it, it continues to learn, continues to grow, and begins to immediately apply it. We humans sometimes over-process and may over-synthesize, and that's good in some areas and not good in others.
When you think about over-processing and over-synthesizing certain things, that can kind of be a holdup. When I think about applying AI... I'll just talk about this really quickly. When I think about applying AI in security, there are a lot of low-level security tasks. I'll take it a step further: there are a lot of low-level technology tasks that we maybe don't have to have humans do. We can have some type of AI bot do that task for us, so that we as humans, the smarter people, work on the higher processing, the things that need a higher amount of intelligence.
That's how I think you manage this whole talent gap or talent shortage. You allow AI to do the smaller things, the data-processing types of things. I can think of certain things: when I think about our networks, and closing ports that don't need to be open, a bot can do that. I don't need to pay an engineer... a systems engineer or a security engineer to do that.
Dan Patterson: To shut down port 80.
Alissa Abdullah: Right.
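The kind of low-level task Abdullah describes handing to a bot can be sketched in a few lines. This is a minimal illustration, not anything from Xerox: the host, candidate ports, and allowlist below are all hypothetical assumptions, and a real tool would enumerate listening services rather than probe a fixed list.

```python
import socket

ALLOWED_PORTS = {22, 443}               # ports we expect to be open (assumed policy)
CANDIDATE_PORTS = [22, 80, 443, 8080]   # ports to probe (illustrative)

def open_ports(host, ports, timeout=0.5):
    """Return the subset of `ports` accepting TCP connections on `host`."""
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the connection succeeded
                found.append(port)
    return found

def unexpected_ports(host):
    """Ports that are open but not on the allowlist -- candidates to close."""
    return [p for p in open_ports(host, CANDIDATE_PORTS)
            if p not in ALLOWED_PORTS]
```

A scheduled job could run `unexpected_ports` against each host and file a ticket, or trigger a firewall change, for anything it flags, which is exactly the sort of rote check that doesn't need an engineer's time.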
Dan Patterson: I totally understand what you're saying, that we will mechanize and automate certain systems, but how do we deal with... I suspect when we zoom in and look at 2019, 2020, we are going to see a ton of news stories about facial recognition technology gone wrong, or AI bias that has done some harm to some group somewhere. It doesn't have to be... right now we're talking about bros in Silicon Valley. It doesn't have to just be that group, but anytime you get one group programming something that affects a lot of people, you get biases programmed into the systems.
Anytime I have any conversation about AI, and especially AI security, my fears always go to the biases that are inherent, or maybe implicit, biases we don't even think about, but they end up in the algorithm. How do we keep ourselves safe and secure? How do we keep our companies secure, our government secure, if our algorithms are being programmed by a small handful of people?
Alissa Abdullah: I think it's an age-old conversation. Systems engineers have always said, good data in is good data out.
I think, when you really boil it down, it goes back to that foundation: good data in, good data out. Because my concern is not the bias. Think about that example I gave: three times for a human to learn something, 3 million times for a bot, or an AI engine, to learn something.
Let's say I've gotten to 2 million times, and at 2 million and one, I get a bit of bad data. Now I've got to start all over again. So I think, when you talk about good data in, good data out, we have to protect the data set. I think the protection of the data set may be at a more important level of influence than the biases that may be programmed into the bot. I say that because the adversary is going after the data set.
That's the easiest way to penetrate an AI engine, I think: penetrate the data set. Add one piece of bad data, and now your bot is going bonkers and you're trying to figure out why, and it may not be because of pre-programmed bias, but because of a bad data set.
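Abdullah's point about a single bad record can be made concrete with a toy example (not from the interview): a one-nearest-neighbor rule on made-up, one-dimensional data, where injecting one mislabeled point near a query flips the prediction.

```python
def nearest_label(x, dataset):
    """Predict by copying the label of the closest training point to x."""
    return min(dataset, key=lambda pt: abs(pt[0] - x))[1]

# Hypothetical training set: (feature, label) pairs.
clean = [(0.0, "benign"), (1.0, "benign"), (9.0, "malicious")]

# The adversary adds ONE mislabeled point close to the region of interest.
poisoned = clean + [(1.1, "malicious")]

print(nearest_label(1.2, clean))     # "benign"
print(nearest_label(1.2, poisoned))  # "malicious" -- one bad record changed the answer
```

The model code is untouched in both cases; only the data set changed, which is why protecting the training data matters at least as much as auditing the algorithm.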