The Triple DNA Helix of AI at the Edge

Kevin Hannah, Director of Product Operations for Kazuhm, explores why artificial intelligence should take center stage at the edge. In the blog below, and in Amelia Dalton’s ‘fish fry’ podcast from the EE Journal (which you can access here), he argues that our ability to process the tsunami of information coming at us from 5G, IoT, and Big Data will depend on how successfully artificial intelligence is deployed at the edge.

As Neo was told, “that is the sound of inevitability,” and so it is for organizations when it comes to both AI and the Edge. But inevitable as it is, if we are to see the delivery of tangible business value rather than just continuing to read articles espousing lofty promises of what will be, we need to understand the three complementary, entwined strands that make AI at the Edge both possible and, more importantly, financially viable.

AI Applications are the obvious end-user manifestation of AI at the Edge. But why focus on AI rather than one, or many, of the other technology darlings such as AR, VR, and Autonomous Driving? All are perceived to deliver value at the Edge based on their need for low-latency performance; reduced data movement, whether for bandwidth savings or for compliance with jurisdiction/sovereignty requirements; survivability; and reliability.

The business case for AI is simply an extension of the tidal wave of Business Intelligence and Analytics associated with all things Big Data. And that is the key. The massive data volumes generated by next-generation connected Internet of Things (IoT) devices continue to grow exponentially.

AR/VR are cool to demonstrate but have offered little to organizations in terms of real revenue gain, and Autonomous Driving faces a long uphill struggle toward regulatory adoption.

But the use of AI, trained using Machine Learning (ML) algorithms, on data at the Edge is easy to grasp in terms of immediate business benefit: insights generated, and immediate actions taken, where the data is produced rather than relying on distant, centralized cloud resources. Nowhere is this more evident than in Manufacturing, where high-precision manufacturing and robotics require AI located on premises to ensure real-time responsiveness, while connected machines and sensors provide new insights into predictive maintenance and energy efficiency across disparate geographic locations in pursuit of improved operating profit.

However, the Edge is a continuum stretching from the IoT device layer, through the Access Edge “last mile” layer, to the Infrastructure Edge data center layer, with ML on aggregated data seamlessly picking up where work at the device leaves off. Ultimately, this provides the opportunity to improve scalability and performance by placing AI at the optimal location in the Edge topology.

And it is this AI-as-a-Service sitting at the network edge that represents a key monetization opportunity for Communication Service Providers (CSPs). It allows them to move away from selling undifferentiated basic bandwidth services, become relevant in the developing AI Application ecosystem, and drive new revenue. This is a time-sensitive endeavor as the major public cloud providers look to extend their reach in reaction to Gartner’s prediction that “the edge will eat the cloud.”

Edge Infrastructure is the domain of the CSPs who, as we have discussed, are leveraging their network infrastructure as a rich, functional platform for AI applications. Ownership of access networks and edge cloud infrastructure gives them a competitive advantage over public cloud providers, particularly in the 5G era. And without 5G, the network will struggle not only to provide connectivity for the billions of anticipated IoT devices but also to transmit the huge volumes of data they will generate.

Hand in hand with 5G come Software-Defined Networking (SDN), designed to make networks more flexible and agile through Network Function Virtualization (NFV), and Mobile Edge Computing, or Multi-Access Edge Computing (MEC), which is essentially a cloud-based IT service environment at the edge of the network.

A set of standardized compute resources, both CPU and GPU, is provided, running cloud-native applications and orchestration to mimic the platform simplicity, API familiarity, and developer comfort of the cloud. But within the 5G networks, these resources reside on a playing field differentiated by location… a game the CSP can win.

So, with companies such as NVIDIA looking to Edge-located GPUs to support AR, VR, and Connected Gaming over this standardized 5G infrastructure, these resources, although not a direct use of AI as mentioned earlier, can be recaptured when idle as powerful accelerators for AI training algorithms.

And back to the billions of anticipated IoT devices, such as mobile phones, whose internal compute resources are becoming increasingly powerful. They can now enable Federated Learning, a privacy-preserving mechanism that leverages these decentralized compute resources to train ML models, coordinated through other Edge-located ML resources.
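To make the federated idea concrete, here is a minimal sketch of the federated averaging approach in the spirit described above: each device trains on its own private data and shares only model weights, which an edge coordinator combines. The model, device sizes, and training loop are illustrative assumptions, not any particular vendor’s implementation.

```python
import numpy as np

def local_update(weights, data, labels, lr=0.1, epochs=5):
    """One device refines the shared model on its own private data
    (simple logistic-regression gradient steps); raw data never leaves it."""
    w = weights.copy()
    for _ in range(epochs):
        preds = 1 / (1 + np.exp(-data @ w))
        grad = data.T @ (preds - labels) / len(labels)
        w -= lr * grad
    return w

def federated_average(device_weights, device_sizes):
    """The edge coordinator combines device models, weighted by
    how much data each device holds."""
    total = sum(device_sizes)
    return sum(w * (n / total) for w, n in zip(device_weights, device_sizes))

# One example round: three phones train locally, the edge node aggregates.
rng = np.random.default_rng(0)
global_w = np.zeros(4)
updates, sizes = [], []
for n in (50, 80, 120):  # hypothetical samples per device
    X = rng.normal(size=(n, 4))
    y = (X @ np.array([1.0, -1.0, 0.5, 0.0]) > 0).astype(float)
    updates.append(local_update(global_w, X, y))
    sizes.append(n)
global_w = federated_average(updates, sizes)
```

The key property is that only the weight vectors cross the network; the per-device datasets stay where they were generated, which is exactly the privacy-preserving appeal at the Edge.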

A complete, connected, ecosystem hosting AI stacks for both the CSP and their clients/partners offers the opportunity to rethink business models and how to participate in value creation, value distribution and value capture. Here, effective participation is the key to monetizing network infrastructure.

AI-Enablement is the CSP’s own use of the AI stack for automated workload orchestration, the underpinning for provisioning and managing services and applications at the Edge.

This means the Edge itself becomes more intelligent, making it not only relevant for low-latency applications but also able to unlock highly intelligent and secure opportunities: data transmission efficiencies, traffic steering, zero-touch service management, and optimal workload placement (including Virtual Network Functions, VNFs). In short, a smart way to handle the right workload, on the right resource, for the right reason, whether that be cost, performance, security/compliance, routing, or reliability.
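The “right workload, right resource, right reason” idea can be sketched as a simple constraint-then-cost placement rule. The resource inventory, latency figures, and costs below are hypothetical stand-ins, not real CSP data or any product’s actual placement logic.

```python
# Hypothetical inventory of Edge-continuum resources.
RESOURCES = [
    {"name": "on-prem-gpu",  "latency_ms": 2,  "cost_per_hr": 1.20, "compliant": True},
    {"name": "edge-dc",      "latency_ms": 10, "cost_per_hr": 0.60, "compliant": True},
    {"name": "public-cloud", "latency_ms": 45, "cost_per_hr": 0.30, "compliant": False},
]

def place(workload):
    """Filter on hard constraints (compliance, latency budget), then pick
    the cheapest surviving resource. Returns None if nothing qualifies."""
    candidates = [
        r for r in RESOURCES
        if (not workload["needs_compliance"] or r["compliant"])
        and r["latency_ms"] <= workload["max_latency_ms"]
    ]
    if not candidates:
        return None
    return min(candidates, key=lambda r: r["cost_per_hr"])["name"]

# A latency-critical robotics job stays on premises; a tolerant batch
# job drifts to the cheapest resource, the public cloud.
print(place({"needs_compliance": True,  "max_latency_ms": 5}))    # on-prem-gpu
print(place({"needs_compliance": False, "max_latency_ms": 100}))  # public-cloud
```

A production orchestrator would of course weigh many more signals (routing, reliability, live telemetry), but the shape of the decision, hard constraints first, then optimization, is the same.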

AI will be critical to network automation and optimization, with real-time decisions needed in support of traffic characterization, meeting end-to-end quality of service, and, in particular, Dynamic Network Slicing, which allows CSPs to monetize their infrastructure by offering multiple service tiers at different price points. For example, a slice of the network handling factory-floor robotics that rely on ultra-low latency may garner a higher price than a parallel slice for less time-sensitive edge compute.

The DNA of AI at the Edge is now starting to form. Time will tell who will endure, through financial success, to pass theirs on to a next generation in which AI functionality is so completely decoupled and so broadly disseminated that it will seem to disappear altogether.

Want to hear more? Listen to Amelia Dalton’s podcast ‘fish fry’ from the EE Journal featuring Kevin Hannah, Director of Product Operations for Kazuhm, at the link below.

The Curious Case of the Critical Catalyst – Why Artificial Intelligence will be the Darling of the Edge


Kazuhm Wins 2019 NAB Show Product of the Year Award

Next-Generation Workload Processing Platform Recognized for Achievements and Innovation in IT Networking/Infrastructure and Security 

San Diego, CA – April 17, 2019—Kazuhm today announced that it received a Product of the Year award at the 2019 NAB Show—a program that aims to recognize the most significant and promising new products and technologies showcased by exhibitors. Kazuhm was specifically recognized in the IT Networking/Infrastructure and Security category.

Kazuhm is a workload processing platform that allows organizations to recapture existing IT resources and intelligently manage work across a fabric of desktops, data centers, cloud, and edge/Internet of Things (IoT). The intuitive user interface (UI) is easy to operate, helping to limit an organization’s dependence on IT support. Enterprise customers typically see such benefits as lower IT and cloud costs, enhanced security, improved performance, and reduction of latency by enabling the right work to be processed on the right resource for the right reason.

“Since our company’s inception, we’ve seen tremendous excitement and opportunity to apply our best-in-class solution to telecommunications, media and entertainment. These industries often require low-latency solutions for delivering entertainment services, or compute resource-heavy applications to support transcoding and rendering,” said Tim O’Neal, Kazuhm CEO. “Regardless of the use case, Kazuhm is well-equipped to meet companies’ growing needs in the space, as our secure, AI-enabled platform delivers optimal compute workload placement and processing across compute resources.”

Adds O’Neal, “We’re honored to be among the first class of technology providers to win an NAB Show Product of the Year award, and hope to harness the present momentum to serve this robust and evolving market.”

NAB Show Product of the Year award winners were selected by a panel of industry experts in 16 categories and announced at an awards ceremony and cocktail reception at the Westgate Las Vegas Resort on April 10. To be eligible for an award, nominated products and technologies needed to be on display at the 2019 NAB Show for the first time and available for delivery in calendar year 2019. Additional details can be found here.

“Nominees like Kazuhm are revolutionizing the way people experience media and entertainment,” said NAB Executive Vice President of Conventions and Business Operations, Chris Brown. “The 2019 NAB Show Product of the Year Awards highlight the best of what’s new at the premier launchpad for breakthroughs at the intersection of media, entertainment and technology.”

To learn more about Kazuhm, including its proprietary video transcoding application, which debuted at NAB this year, please visit


Kazuhm Named a 2019 “Cool Company” by San Diego Venture Group

First-of-its-Kind Workload Processing Platform Joins 32 of the Fastest Growing, Most Exciting Startups in Southern California

San Diego, CA – April 2, 2019—Kazuhm, a next-generation workload processing platform, today announced that it has been recognized as one of only 33 “Cool Companies” for 2019 by San Diego Venture Group. Kazuhm won out amongst more than 250 applicants.

San Diego Venture Group (SDVG) promotes the formation, funding, and development of innovative new ventures in the San Diego community. SDVG’s Cool Companies list highlights the fastest-growing, most exciting startups in Southern California.

“We are grateful for this recognition from San Diego Venture Group,” said Kazuhm CEO Tim O’Neal. “We are proud to call ourselves a member of the San Diego startup community, which we strongly feel is one of the most active, thriving and important startup communities in the United States today.”

Launched in October of 2018, Kazuhm is a commercial-grade distributed computing workload processing platform that empowers organizations to maximize all compute resources across desktop, server, and cloud, all the way to the Edge. The platform enables organizations to recapture and use all existing compute nodes to process containerized workloads, saving IT costs and enhancing performance and security.  Its desktop recapturing technology is used by organizations across telecom, healthcare, retail, financial services, higher education and more.

“We were happy to name Kazuhm a 2019 Cool Company,” said SDVG President Mike Krenn. “One of the reasons why we love doing the annual ‘Cool Companies’ list is because it shows how the extremely diverse San Diego tech ecosystem is now a hotbed for all kinds of innovation, in an array of key areas.”  

This news comes on the heels of Kazuhm’s recent announcement that it has joined the NVIDIA Inception Program. Inception nurtures dedicated and exceptional startups who are revolutionizing industries with advances in AI and data science.

Kazuhm was also recently named an official nominee in the first-ever NAB Show “Product of the Year” Awards. NAB Show is the world’s largest event focused on the intersection of technology, media, and entertainment. Kazuhm will be exhibiting at NAB (N2739 – The Startup Loft) from Saturday, April 6 through Thursday, April 11.

This year’s Cool Companies event will be on April 30, at the Belly Up Tavern in Solana Beach, and will give participants, like Kazuhm, an opportunity to meet with more than 60 venture capital firms and 20 local investors.