Monthly Archives: April 2021

Do your computers have ID? The state of machine identity management

Ponemon Institute and Keyfactor kicked off the first-ever State of Machine Identity Management Report with one purpose: Drive industry awareness around the importance of managing and protecting machine identities, such as keys, certificates, and other secrets, in digital business.

For the 2021 State of Machine Identity Management Report, Ponemon Institute surveyed 1,162 respondents across North America and EMEA who work in IT, information security, infrastructure, development, and other related areas.

We hope that IT and security leaders can use this research to drive forward the need for an enterprise-wide machine identity management strategy. No matter where you are in the business – IT, security, or development – and no matter the size of your company, this report
offers important insights into why machine identities matter.

In recent years, we’ve witnessed the rapid growth of internet-connected devices and machines in the enterprise. From IoT and mobile devices to software-defined applications, cloud instances, containers, and even the code running within them, machines already far
outnumber humans.

Much like the human identities we rely on to access apps and devices we use every day (e.g., passwords, multi-factor, etc.), machines require a set of credentials to authenticate and securely
connect with other devices and apps on the network. Despite their critical importance, these “machine identities” are often left unmanaged and unprotected.
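To make the parallel concrete, here is a minimal sketch, using Python's standard ssl module purely for illustration, of where a server's machine identity is actually checked: the default TLS client settings require exactly the two things that make a certificate a usable identity.

```python
import ssl

# Python's default client-side TLS settings are the enforcement point
# for a server's machine identity.
context = ssl.create_default_context()

# The server's certificate must chain to a trusted CA...
print(context.verify_mode == ssl.CERT_REQUIRED)  # True

# ...and must match the hostname the client asked for.
print(context.check_hostname)                    # True

# Unmanaged deployments often disable these checks to silence
# certificate errors, which removes identity verification entirely:
#   context.check_hostname = False
#   context.verify_mode = ssl.CERT_NONE
```

When either check is switched off, the connection still encrypts traffic, but the client no longer knows which machine it is talking to, which is the gap this report is about.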

In the 2020 Hype Cycle for Identity and Access Management Technologies, Gartner introduced a new category: Machine Identity Management. The addition reflects the increasing importance of managing cryptographic keys, X.509 certificates, SSH keys, and other non-human identities.

Machine identities have undoubtedly become a critical piece in enterprise IAM strategy, and awareness has reached even the highest levels of the organization. Sixty-one percent of respondents say they are either familiar or very familiar with the term machine identity management.

“Machine identities, such as keys, certificates and secrets, are essential to securing connections between thousands of servers, cloud workloads, IoT and mobile devices,” said Chris Hickman, chief security officer at Keyfactor. “Yet the survey highlights a concerning and significant gap between plan and action when it comes to machine identity management strategy. Acknowledgment is a step in the right direction, but a lack of time, skilled resources and attention paid to managing machine identities make organizations vulnerable to highly disruptive security risks and service outages.”

In this section, we highlight key findings based on Keyfactor’s analysis of the research data compiled by Ponemon Institute. For more in-depth analysis, see the complete findings.

Strategies for crypto and machine identity management are a work in progress.

Despite growing awareness of machine identity management, the majority of survey respondents said their organization either has no strategy for managing cryptography and machine identities (18 percent of respondents) or has only a limited strategy applied to certain applications or use cases (42 percent of respondents).

The top challenges that stand in the way of setting an enterprise-wide strategy are too much change and uncertainty (40 percent of respondents) and a lack of skilled personnel (40 percent of respondents).

Shorter certificate lifespans, key misconfiguration, and limited visibility are top concerns.

Challenges in managing machine identities include the increased workload and risk of outages caused by shorter SSL/TLS certificate lifespans (59 percent of respondents), misconfiguration of keys and certificates (55 percent of respondents), and not knowing exactly how many keys and certificates the organization has (53 percent of respondents).

A significant driver of these challenges is the recent reduction in the lifespan of all publicly trusted SSL/TLS certificates by roughly half, from 27 months to 13 months, effective September 1, 2020. It is worth noting that the real impact of this change will likely not be felt until the months and years ahead.
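As a back-of-the-envelope illustration of why halved lifespans increase workload: the five-year planning horizon and the example notAfter date below are made up for illustration, while ssl.cert_time_to_seconds is the standard-library parser for certificate date strings.

```python
import ssl
from datetime import datetime, timezone

# Renewal math behind the new limit: a certificate that could last up
# to 27 months now lasts at most 13, so over any planning horizon the
# renewal workload roughly doubles.
renewals_5yr_old = 5 * 12 / 27   # ~2.2 renewals per certificate
renewals_5yr_new = 5 * 12 / 13   # ~4.6 renewals per certificate
print(round(renewals_5yr_new / renewals_5yr_old, 1))  # 2.1

# A renewal tracker only needs a certificate's notAfter field to flag
# upcoming expirations. The date string here is a made-up example in
# the format Python's ssl module parses.
def days_until_expiry(not_after: str) -> int:
    expiry = ssl.cert_time_to_seconds(not_after)
    now = datetime.now(timezone.utc).timestamp()
    return int((expiry - now) // 86400)
```

Knowing exactly how many such notAfter dates exist across an organization is, per the survey, what 53 percent of respondents say they lack.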

Crypto-agility emerged as a top strategic priority.

Moving into the top position on the list, more than half of respondents (51 percent) identified crypto-agility as a strategic priority for digital security, followed by reducing complexity of IT infrastructure and investing in hiring and retaining qualified personnel (both 50
percent of respondents).

Cloud and Zero-Trust strategies are driving the deployment of PKI and machine identities.

While many trends are driving the deployment of PKI, keys, and certificates, the two most important trends are cloud-based services (52 percent of respondents), and Zero-Trust security strategy (50 percent of respondents). Other notable trends include the remote workforce and IoT devices (both 43 percent of respondents).

SSL/TLS certificates take priority, but every machine identity is critical.

Overall, respondents agree that managing and protecting every machine identity is critical. That said, SSL/TLS certificates were widely considered the most important machine identities to manage and protect, according to 82 percent of respondents.

To see the report’s full findings, visit Keyfactor.com.


What’s the original sin of the Internet? A new podcast

Bob Sullivan

Is there an Original Sin of the Internet? Join me on a journey to find out.

Today I’m sharing a passion project of mine that’s been years in the making. I’m lucky. I’m getting old. Much better than the alternative! My career has spanned such a fascinating time in the history of technology. I learned to *literally* cut and paste at my first newspaper. Now, most of the world is run by computer code that’s been virtually cut and pasted. Often, carelessly cut and pasted. Look around, and it’s fair to ask: Has all this technology really made our lives better? My answer is yes, but by a margin so slim that objectors might call for a recount.

Whatever your answer, there is no denying that tech has landed us in a lot of trouble, and the techlash is real. And for those of us who thought the Internet might end up as humanity’s greatest invention, this time is depressing. One of my guests — a real Internet founder — thinks perhaps he should have done something else with his life.

Debugger, launching today, is a podcast, but I think of it more as an audio documentary. There are no sound bites. I let my guests talk and try to stay out of the way. So you can make up your own mind. Thanks to the great people at Duke’s Sanford School of Public Policy and the Kenan Institute for Ethics, I have access to amazing people who were there at the dawn of the Internet Age. I hope you’ll listen, but if you’d rather read, I’ll spend this week sending out edited transcripts from each guest.

First up: Richard Purcell, one of the first privacy executives. From him, you’ll learn as much about working on the railroad as you will about the abuse of power through privacy invasions. But before that, I try to explain what I mean by “original sin” in the introduction, and why that matters.

Future Debugger episodes will deal with similar foundational questions about technology and its role in democratic society. Why do 1,000 companies know every time I visit one web page? How do data brokers interfere with free and fair elections? What should we do with too-big-to-fail tech giants? How can we capture medical miracles trapped in data without violating patients’ privacy? And how can we build tech that isn’t easily weaponized by abusive people or enemy combatants? That’s coming soon, on Debugger. On to the transcript for today. Click here to visit the podcast home page. Or, click below to listen.


[00:01:27] Bob Sullivan: Welcome to Debugger, a podcast about technology brought to you by Duke University’s Sanford School of Public Policy and the Duke Kenan Institute for Ethics. I’m your host, Bob Sullivan. And I care a lot about today’s topic. So please indulge me for a moment or two while I try to frame this issue.

I came across a story many years ago that still haunts me, as a technologist and an early believer in the internet, because it reads like a sad pre-obituary about a once-famous pop singer who’s now a broke has-been with a drug problem … and, as a writer, because its prose is nearly poetry. At least to my ears, it’s the kind of thing I wish I’d written. Credit Steve Maich at Maclean’s, the Canadian magazine, for the words. Dramatic reading by old friend Alia Tavakolian:

[00:02:24] Alia Tavakolian: The people who conceived and pioneered the web described a kind of enlightened utopia built on mutual understanding. A world in which knowledge is limited only by one’s curiosity. Instead, we have constructed a virtual Wild West where the masses indulge their darkest vices, pirates of all kinds troll for victims, and the rest of us have come to accept that cyberspace isn’t the kind of place you’d want to raise your kids. The great multinational exchange of ideas and goodwill has devolved into a food fight. And the virtual marketplace is a great place to get robbed. The answers to the great questions of our world may be out there somewhere, but finding them will require you to first wade through an ocean of misinformation, trivia and sludge. We’ve been sold a bill of goods. We’re paying for it through automatic monthly withdrawals from our PayPal accounts.

Let’s put this in terms crude enough for all cyber dwellers to grasp: The internet sucks.

[00:03:23] Bob Sullivan: The internet sucks? I’ve thought about this story for years, come back to it once in a while, but it’s been a while. In fact, it’s been 15 years since those words were first written, and a lot has happened since then.

· My name is Ed Snowden. I’m 29 years old. I work for Booz Allen Hamilton as an infrastructure analyst for NSA in Hawaii.

· What exactly are they saying these Russians did? … Well, there’s a lot of things that we’re alleging the Internet Research Agency did. Um, the main thing is that they posed as American citizens to amplify and spread content that causes division in our society.

· Tonight, Facebook stock tanking, dropping nearly 7% after allegations that Cambridge Analytica secretly harvested the personal information of 50 million unsuspecting Facebook users.

· Cyber experts warn the Equifax hack has the potential to haunt Americans for decades. And every adult should assume their information was stolen.

· Social media is just one of many factors that played a role in the deadly attack on the U.S. Capitol, but it’s a huge one. That attack was openly planned online for weeks.

Bob Sullivan: If the internet sucked in 2006, what should we say about it now? I remember being an intern with Microsoft in 1995, a small part of the launch team for Windows 95. I helped launch internet news. I remember feeling at the time … it was very heady. Like John Perry Barlow, the co-founder of the Electronic Frontier Foundation and a Grateful Dead lyricist … we both felt the internet could one day rival fire in its importance to humanity. Well, actually, what he said was it was the most transforming technological event since the capture of fire.

So I think we should all admit we haven’t captured the internet. It’s a lot more like an uncontrolled fire right now. Or maybe like a wild animal we haven’t domesticated. Not yet. Anyway, how did this happen? How did we lose control of it? Where did we go wrong? Was there some original sin of the internet, a moment when we turned right when we should have turned left? Looking backward isn’t always worthwhile, but sometimes it is. When you’re doing a long mathematics calculation and you make a mistake, it’s not possible to erase the answer and correct it. You have to trace your steps back to the original error and calculate forward anew.

I think it’s time we did that with the web.

Maybe this seems like an academic question, but it’s not. The coronavirus pandemic has taught humanity a very painful lesson. By now, we’ve all come to realize that, like it or not, we’re in this together. We can’t rid half the planet of COVID-19 and hope for the best. That won’t work. We have to all pull in the same direction, all do the things we need to do: wear masks, avoid indoor spaces, vaccinate when we can … to get and keep the virus on the run. And that won’t happen if we don’t all agree on the same set of facts. But right now the most powerful disinformation machine ever, the biggest lie-spreading tool ever, seems to have truth on the run.

So it’s not just academic, it’s personal. It’s life or death.

How do we capture digital fire? How do we domesticate the wild animal that is the internet? The best way to get out of a hole is to stop digging. So I want to begin there.

For the next 45 minutes or so, I’m going to pursue this question of an original sin with the help of a series of experts who were there. As you’ll find out, while some of them might not like the way I frame the question, no one disagrees with the basic premise: We’ve built fatal flaws into our digital lives and we’d better fix them fast.


My first stop is with Richard Purcell. We were at Microsoft together. He was chief privacy officer at Microsoft, one of the first people to ever hold that title, back when I was a cub reporter at msnbc.com. We hadn’t talked in years. I caught up with him on Data Privacy Day, a holiday that’s been celebrated for more than a decade in the U.S., though perhaps you don’t celebrate it.

—-

Bob Sullivan: Okay. So I forgot, by the way, to wish you a happy Data Privacy Day.

[00:07:52] Richard Purcell: This is Data Privacy Day, it is the 28th. And you know, in an odd way, Bob, people like me and you and others across the United States are today celebrating the Europeans’ decision to ensconce privacy as a fundamental human right. Um, there are people who would say, gosh, you know, we shouldn’t be celebrating foreign countries’, foreign regions’, uh, social awareness. We should be doing it ourselves.

[00:08:24] Bob Sullivan: Richard took what you’d think of today as a very unusual route to an executive job at a big software company. But then, when Richard was a teenager, there really weren’t big software companies.

Richard, when I was preparing to talk to you today, I read a little bit about you and learned some things I didn’t know. Um, including that you worked in railroad maintenance when you were a kid.

[00:08:49] Richard Purcell: I did. I did. I like to ask people about what they did in their 18th year. So imagine you graduated high school, you’re perhaps off to university or some other life study to launch yourself into adulthood: what’d you do? And I’ve asked that question of a lot of people and I’ve had fascinating answers. Privileged people haven’t done much, in my opinion, and in my research, which is anecdotal.

But what I did is I went out on the Union Pacific track lines and I repaired railroad tracks for two summers in a row, actually, to pay for university tuition. So I sweated in the hot sun, swinging a hammer and pushing railroad steel around and pulling out and putting back in creosote timbers for ties and all of that kind of stuff.

It’s what’s called a Gandy dancer. That’s when you have one foot on the ground and one foot on your shovel and you’re pushing rock underneath a railroad tie in order to secure it and keep it from moving. That’s the Gandy dance. When you get 20 people out there Gandy dancing, it looks pretty funny.

[00:10:01] Bob Sullivan:  Richard’s work on the railroad provides an interesting metaphorical starting point for our discussion.

[00:10:08] Richard Purcell: I’ve repaired a few derailments down on the Columbia River, where locomotives are on their side in a slough, puffing and still running and pushing bubbles into the dirty water. It’s pretty, it’s pretty bizarre when you’re working on a river.

[00:10:23] Bob Sullivan: I feel like you just described the state of the internet.

[00:10:27] Richard Purcell: I know. Don’t you think? Yeah. Laying on its side, puffing. Yeah, no, I’m with you. You know, maybe that’s true, Bob, maybe it’s not. Because I predicted, when Facebook faced its Cambridge Analytica scandal, which was a tremendous scandal and was, uh, not only an impeachable offense but one which they should have been convicted for, that their value would eventually drop.

That it would take a while, but their value would eventually drop. Frankly, it just hasn’t. The users of these internet services seem to be highly resistant to the social ramifications of the kind of negative effects of those companies. And, you know, is somebody worth $62 billion to exploit the world’s social fabric? I don’t think so. That’s not a bargain I would want to make. But it’s one we have made.

[00:11:30] Bob Sullivan: Richard’s unusual path to the tech world colors his perceptions about the internet today, and about the role of power in social circles and in leadership.

[00:11:39] Richard Purcell: I grew up … strictly the ’50s, middle-class, easy, no-problem life, but you know, but absolutely no prosperity whatsoever. But what I saw in everyday life is that there are these power relationships that are unfair. Those with power, even in a small town like I grew up in, are loath to give up that power. And for some reason are inured to the fact of their privilege; they feel like their privilege is an entitlement.

I worked in the forest. I’ve done a lot of things. I ran a grocery store. I started a newspaper. I did all these things in communities, and the vibrancy and the health of a community is what I find lacking. Leadership begins to be tainted by the objective of actually maintaining a power relationship instead of sharing it, or instead of using it to create more community vibrancy and health. I find those practical experiences made a big difference in my life.

[00:13:05] Bob Sullivan: It seems like you connect privacy to power, maybe more than someone else might.

[00:13:11] Richard Purcell: Oh, it is about power. Yeah. Yeah. It’s unquestionably about power. If I can know enough about you, I can manipulate you without a question. And that is a power relationship, and the more successful I am, and the more clever I am about that, and the more disguised I am about my motivation, uh, the more advantageous it is to me. But yes, the lack of privacy is the lack of power. Without question, because frankly it is the lack of dignity. It’s the lack of control over my own life. And in fact, the European Union … we celebrate Data Privacy Day today … the European Union’s basis of data protection is the freedom to develop a personality. That’s the language that they used when they promoted data protection and privacy some 40 years ago. And so the whole idea that you are free to develop your own personality indicates how much of a power relationship it is.

[00:14:21] Bob Sullivan: So if data equals power, and privacy is about power, and 40 years ago people were thinking about this, where did we go wrong? Where did the engineers drive the train off the tracks, if you will? Richard, what is the original sin of the internet?

[00:14:38] Richard Purcell: The original sin of the internet, to me, is a failure on our part to key in on the basic question of: just because I can do something doesn’t mean that I should do it. In other words, if I can engineer something … internet history demonstrates that because I can engineer it, then I should use it in any way that that engineering allows. And that just isn’t how life should work. We’ve had many, many follies in our time over that. I don’t want to get overly dramatic about that, and I don’t want to use too harsh examples of that, but the question really is: the internet was developed as an electronic means of communication without regard to the content of that communication, largely because the engineers enabled scientists and researchers to communicate with one another. And they had benign intents for the most part. And it was never thought that anybody using it would have any other kind of intent.

[00:15:50] Bob Sullivan: Our first history lesson of this podcast. We’ll talk a lot about that naive take going forward. And we’ll also talk about the word privacy, which, I’m here to tell you, is always a pretty big risk as a storyteller.

I think the conversation we’re having, you know, if we had it three or four years ago, it would have felt really academic and been pretty boring to most people.

[00:16:12] Richard Purcell: It has been, you’re right. It has been very boring. I’ve bored people for a long time with this kind of, gosh, what if, jeez, you know, shouldn’t it be this way or that way? And then the stark reality comes with Cambridge Analytica and, oh my gosh, look at this. We can manipulate people.

[00:16:31] Bob Sullivan: But I think what is new to people is, okay, it’s one thing to manipulate them into buying a certain brand of toothpaste. It’s another thing to manipulate them into not believing in democracy anymore.

[00:16:42] Richard Purcell: Isn’t that the truth? I mean, now nefarious, you know, characters really have some sophisticated controls, not just blunt-instrument controls, and have clear objectives.

It’s hard to understand, isn’t it, Bob? What would the clear objective be of somebody who wants to create an unconstitutional limit on free and fair elections? And there’s no way that’s a beneficent objective. That’s very much a malicious objective, um, because it’s about the accumulation and centralization of some kind of power and authority and control over large populations.

That’s what’s frightening me the most: there are … there are actors in the background who have a clear objective to create a centralized, powerful control mechanism. Um, and democracy is standing in its way.

[00:17:52] Bob Sullivan: Democracy is standing in the way. Thank goodness for that. Except when this new digital battleground for control was built, we didn’t have great models to rely on. So we borrowed heavily from the one we had and that, well, that might actually be the wrong left turn we made.

[00:18:10] Richard Purcell: In the United States, our commercial world runs largely by a model from telecommunications history, way back in radio and television, that said: Hey, you know, it’s free to use. We just have advertising to support it.

So you don’t have to subscribe to it. And that was back when it was an airwaves broadcast methodology. That model, unfortunately, is persisting: even though the means by which we transmit this information and communicate isn’t an airwaves model anymore, that free access to online content still persists with the underlying advertising model. And they have very strong reasons to believe in it. Advertising as a model has its own dark side, and we see that from all kinds of points of view, of course. Um, but Google and Facebook, they’re not technology companies as much as they are advertising companies. Google, and Facebook, and really all internet companies.

[00:19:14] Bob Sullivan: They’re all advertising companies now, but this is a very different kind of advertising. The best TV could do was create programming that probably attracted 18-to-34-year-olds. Things have changed, and changed fast.

[00:19:32] Richard Purcell: Narrowcasting means that I can put out a blog, I can put out a podcast, I can put out a website that has a very narrow audience. But the fact is, even a narrow audience in global terms can have a large population and therefore create more advertising contacts and, as a result, better monetization. Those issues are just a profound part of how the internet works.

[00:20:00] Bob Sullivan: It sounds obvious to say that privacy stands in the way of this business model. Is that true?

[00:20:06] Richard Purcell: Absolutely. No question about it. Privacy is not friendly to the advertising model of monetization and content narrowcasting because, frankly, the basis of advertising, for the internet particularly but really always, has been the demographics of the audience.