Monthly Archives: February 2026

‘Sugar High’ — Is AI the Future of music, and work?

Bob Sullivan

The next time you fire up your favorite streaming service, the music you hear might be made by a robot. Maybe you don’t care; you’re just looking for something to help you kill those 30 minutes on the treadmill.  But you should care. The sound you might not hear is a canary in a coal mine that’s gone silent. If “human” musicians can be replaced by bots, so can you.

That’s why we’ve just released a new four-part miniseries on the future of music over at the Debugger podcast, which I host for Duke University’s Sanford School of Public Policy.  Grammy-nominated folk singer Tift Merritt is our guide through this complicated cultural and economic story.  The series is called “Sugar High.”

“I think that everyone is a little shell-shocked from streaming, and it’s very hard to get your mind around things getting worse than that,” Tift told me.

The vast majority of music fans don’t know how much streaming services have changed the economics of the music business, but Tift makes it crystal clear. The series follows Tift as she enters the studio to record her first new album in almost a decade — she’d taken time off to raise her daughter.  She’s going to spend about $50,000 to make the record, a bargain by historic standards. But to earn out that advance, she’ll need about 10 million streams on a service like Spotify.

Ten million streams! Just to get back to …$0.

Tift Merritt is, as I explain in episode one, a huge success story. Don Henley covered one of her songs. She toured with Elvis Costello. She has several songs with millions of streams. Her record Tambourine earned a Grammy nomination for country album of the year. And yet, her ability to earn even a middle-class living as a working musician and mom is….well, it doesn’t really exist any longer.

There’s no arguing that tech has made more music available to more people, and it has made it easier for unknown artists to share their undiscovered talents with the world.  That was always the promise of the Internet. But along the way, the path towards discovery has narrowed, as the spoils of the system have been siphoned off by tech companies.

“I remember in 2010 I put a record out and I got my first royalty statement and I realized what a huge impact streaming was on our economy. It was a fourth of what I usually got, and I realized that I could no longer live in New York City. I couldn’t afford it,” she said. “So…oh my God, shouldn’t I be a dental hygienist? This is a, an equation that is broken.”

But that problem pales in comparison to the storm clouds gathering around artificially generated music.  Artists and record labels alike are worried that “robots” — trained by ingesting decades of music recordings — will generate endless royalty-free ghost music.  Those songs will fill listeners’ playlists, crowding out real art, leaving musicians like Tift without revenue streams.

That future feels overstated.  Listeners will reject soulless music, won’t they? Like so much of today’s AI conversation, this debate is full of hyperbole and puffery, investment bubbles and doomsayers. One can imagine AI tools being part of human music creation, just as synthesizers and sampling have been used to make art. But one can also imagine large tech companies making the decisions that suit them, artists and art be damned.

One thing is certain: absent some other force, cost-cutting will drive the outcome. If AI ghost music is more profitable than real music, it will replace art and artists. Just as AI will replace lawyers, and journalists, and….every other kind of work that can be done cheaper by software.  How do we prepare for this? How do we design outcomes that benefit society as a whole, rather than a small set of investors?  It’s a conversation we need to have right now.

Of course, this conversation deserves far more nuance than I just gave it, which is why this miniseries is just the start of a dialogue.  Later in the series, we’ll hear from Reid Wick of the Recording Academy (the Grammy people) and Jen Jacobson, Executive Director of the Artist Rights Alliance. We’ll be having more interviews at Debugger after we release this four-part miniseries.  I hope you’ll be part of the conversation, too.

I do hope you’ll listen to this series by clicking play below, by clicking off to Spotify, or by finding it on your favorite podcast service. But if podcasts aren’t your thing, a transcript is available here.


Trends in PKI Security: A Global Study of Trends, Challenges & Business Impact

Organizations are struggling to keep their Public Key Infrastructure (PKI) secure and reliable. The primary reasons are the difficulty of keeping pace with an average of 114,591 internal certificates, a lack of in-house expertise, reliance on manual operations, legacy infrastructure and poor visibility.

Public Key Infrastructure (PKI) is a framework for creating, managing, and validating digital certificates that establish trusted digital identities for users and machines (i.e., machine identities such as workloads, containers, IoT devices, and services), enabling secure communications and transactions through cryptographic techniques. PKI primarily uses public key (asymmetric) cryptography, though it often works alongside symmetric cryptography, to provide authentication, encryption, and digital signatures.
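As a concrete illustration of those cryptographic techniques (not part of the study itself), the following minimal Python sketch shows the asymmetric sign-and-verify primitive that PKI builds on, using the open-source cryptography package. In a full PKI, a certificate authority would bind the public key to an identity in an X.509 certificate; the key size and the machine-identity string below are illustrative assumptions.

    # A minimal sketch of the asymmetric sign/verify primitive behind PKI.
    # Requires the open-source "cryptography" package (pip install cryptography).
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    # Key pair: the private key signs, anyone holding the public key can verify.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    # A hypothetical machine identity to authenticate (placeholder value).
    message = b"workload: payments-service.internal"

    signature = private_key.sign(
        message,
        padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
        hashes.SHA256(),
    )

    # Raises cryptography.exceptions.InvalidSignature if message or signature was altered.
    public_key.verify(
        signature,
        message,
        padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
        hashes.SHA256(),
    )
    print("signature verified")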

Few organizations have a high level of confidence in their PKI’s ability to meet compliance requirements. Respondents were asked to rate the effectiveness of and confidence in their organization’s PKI on a scale of 1 = low effectiveness/confidence to 10 = high effectiveness/confidence. As revealed in this research, high confidence in meeting compliance requirements is key to a strong PKI security posture.

Only 46 percent of respondents have high confidence in their PKI’s ability to meet compliance requirements. Less than half (48 percent) of respondents rate their PKI as very or highly effective at protecting against outside attacks and insider threats by ensuring a secure framework for authentication, encryption and data integrity.

As shown in the research, a shortage of in-house expertise is limiting the PKI infrastructure’s ability to scale up with the growing number of devices and workloads. Only 47 percent of respondents say its effectiveness is very high. PKI is rated most effective at enabling secure transactions (53 percent of respondents).

A Summary of the State of PKI Security

 Public Key Infrastructure (PKI) is the backbone of digital trust, enabling secure communications and authentication for users, devices, and services. However, organizations today face mounting challenges in keeping PKI secure, reliable, and compliant.

Confidence and Effectiveness Remain Low

Most organizations lack strong confidence in their PKI’s ability to meet compliance requirements. Only 46 percent of respondents rate their PKI as highly effective for compliance, and less than half believe their PKI is very effective at protecting against threats or scaling with demand. The complexity of managing an average of over 114,000 certificates, combined with legacy systems and manual processes, undermines both security and reliability.

Key Barriers: Misconfigurations, Outages, and Visibility Gaps

The top obstacles to robust PKI security are:

  • Misconfigurations in PKI infrastructure (50 percent of respondents)
  • Unplanned outages from expired certificates (49 percent of respondents)
  • Lack of visibility into certificate inventory (38 percent of respondents)

These issues make it difficult to maintain compliance and increase the risk of security incidents. Legacy costs and risks, as well as failures in security, compliance, and audit processes, further complicate the landscape.

Manual and Infrequent Assessments

While 61 percent of respondents say their organizations regularly assess PKI security, most do so manually (53 percent) or via penetration testing (46 percent), and only a third conduct these assessments weekly or biweekly. This infrequency leaves gaps where vulnerabilities can persist.

Real-World Consequences: Incidents and Outages

Poorly managed PKI and certificates have led to significant cybersecurity incidents:

  • Sixty percent of respondents say their organizations experienced exploits due to weak cryptography
  • Fifty-eight percent of respondents say their organizations suffered an incident involving a third-party certificate authority (CA)
  • Forty-three percent of respondents say their organizations reported server private key theft

Unplanned outages are also common: 56 percent of respondents had outages due to certificate expiration or configuration errors, often stemming from manual tracking and renewal processes.
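As an illustration of the kind of tracking that prevents such outages (a sketch, not a recommendation from the report), the following Python snippet uses only the standard library to check how many days remain before a server’s TLS certificate expires. The hostname and the 30-day renewal window are placeholder assumptions.

    # A minimal sketch of the expiration check that automated monitoring performs.
    # Standard library only; the hostname and threshold below are placeholders.
    import socket
    import ssl
    from datetime import datetime, timezone

    def days_until_expiry(hostname: str, port: int = 443) -> int:
        """Return how many days remain before the server's TLS certificate expires."""
        context = ssl.create_default_context()
        with socket.create_connection((hostname, port), timeout=5) as sock:
            with context.wrap_socket(sock, server_hostname=hostname) as tls:
                cert = tls.getpeercert()
        expires_at = datetime.fromtimestamp(
            ssl.cert_time_to_seconds(cert["notAfter"]), tz=timezone.utc
        )
        return (expires_at - datetime.now(timezone.utc)).days

    if __name__ == "__main__":
        remaining = days_until_expiry("example.com")   # placeholder host
        if remaining < 30:                             # placeholder renewal window
            print(f"WARNING: certificate expires in {remaining} days, renew now")
        else:
            print(f"OK: certificate is valid for another {remaining} days")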

Staff Shortages and Operational Burdens

Organizations typically dedicate only four staff to PKI management, and just 42 percent of respondents feel they have enough in-house expertise. Over half (55 percent of respondents) struggle to keep up with the growing use of cryptographic keys and certificates, leading many (63 percent of respondents) to outsource to managed security service providers.

The Push for Automation and Unified Visibility

Automation is increasingly seen as essential. Fifty-one percent of respondents say their organizations use automated certificate management, citing benefits such as consistent task execution, faster certificate renewal and greater visibility and control. Unified visibility across environments is now the top strategic priority, according to 34 percent of respondents, followed by hiring qualified personnel and reducing PKI complexity.
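To make the idea of unified visibility concrete, here is a rough Python sketch that walks a directory of PEM files and builds a simple certificate inventory sorted by expiration date. The directory path is a placeholder, and the code assumes the open-source cryptography package (version 42 or later for the not_valid_after_utc property); real tooling would also cover network endpoints, cloud key stores and hardware security modules.

    # A rough sketch of building a certificate inventory from PEM files on disk.
    # Assumes the "cryptography" package, version 42+; the path is a placeholder.
    from pathlib import Path
    from cryptography import x509

    def inventory(cert_dir: str) -> list[dict]:
        """Collect subject, issuer, and expiry for every .pem certificate found."""
        rows = []
        for pem_path in Path(cert_dir).rglob("*.pem"):
            try:
                cert = x509.load_pem_x509_certificate(pem_path.read_bytes())
            except ValueError:
                continue  # skip PEM files that are not certificates (e.g. private keys)
            rows.append({
                "file": str(pem_path),
                "subject": cert.subject.rfc4514_string(),
                "issuer": cert.issuer.rfc4514_string(),
                "expires": cert.not_valid_after_utc.isoformat(),
            })
        # Soonest-expiring certificates first, so renewals can be prioritized.
        return sorted(rows, key=lambda row: row["expires"])

    if __name__ == "__main__":
        for row in inventory("/etc/pki/issued"):  # placeholder directory
            print(f'{row["expires"]}  {row["subject"]}  ({row["file"]})')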

Best Practices of High Performing Organizations

Organizations with high confidence in their PKI (“high performers”) are more likely to adopt AI for predicting certificate issues and preventing outages, maintain better visibility into certificate inventory, and support PKI with in-house expertise and effective remediation processes. These organizations report fewer operational burdens and stronger security outcomes.

In summary, PKI security is under pressure from complexity, manual processes, and resource constraints. The most effective organizations are those investing in automation, unified visibility, and skilled personnel—transforming PKI from a source of risk into a foundation for digital trust.

Part 2. Key Findings

Sponsored by CyberArk, Ponemon Institute surveyed 1,833 IT and IT security practitioners in North America (567 respondents), EMEA (503), Asia-Pac (401) and LATAM (362) who are knowledgeable about their organizations’ use of PKI and certificates. In this section of the report, we analyze the research results. The complete audited findings are presented in the Appendix of the report. The report is organized according to the following topics.

  • Securing PKI and certificates
  • The deployment and management of PKI and certificates
  • Best practices of organizations that have high confidence in meeting compliance requirements (aka high performers)
  • Regional differences

Securing PKI and certificates

 Fifty-four percent of respondents say their organizations have little or only some confidence in their PKI’s ability to meet compliance requirements. The major reasons for the lack of confidence are misconfigurations in the PKI infrastructure (50 percent of respondents), unplanned outages caused by expired certificates (49 percent of respondents) and lack of visibility into certificate inventory (38 percent of respondents).

The biggest barrier to securing PKI and certificates is legacy costs and risks. Some 34 percent of respondents say legacy PKI costs and risks are affecting the security of PKI and certificates. Other barriers include the inability to have a centralized view of all internal certificates (31 percent), security, compliance and audit failures (29 percent) and dependence on manual certificate management (28 percent).

PKI security assessments are mostly manual and infrequent. While 61 percent of respondents say their organizations evaluate the security of their PKI infrastructure, only 33 percent of respondents say the evaluation occurs weekly (20 percent) or every two weeks (13 percent). According to Figure 4, the two approaches most often used to assess PKI security are manual assessments (53 percent of respondents) and penetration testing (46 percent of respondents).

The most common way to determine the security of the PKI infrastructure is to assess the effectiveness of the processes for issuing, renewing, revoking and destroying digital certificates.

According to 50 percent of respondents, their organizations examine the processes for issuing, renewing, revoking and destroying digital certificates to assess the security of organizations’ PKI infrastructure. This is followed by the evaluation of overall PKI architecture for vulnerabilities and potential misconfigurations (39 percent of respondents) and the review of procedures and protocols for managing data, responding to security incidents and training staff on PKI best practices (38 percent of respondents).

To see the rest of these key findings, visit CyberArk’s website.