How conversational analytics removes the enterprise intelligence bottleneck
Cybersecurity firms face a paradox. Their customers keep adding more security tools, expecting more security. But the data increasingly shows that tool sprawl makes organizations slower to detect and respond to threats. At the same time, AI is accelerating both sides of the equation: giving defenders new capabilities while making it dramatically easier for attackers to operate at scale.
For over twenty years, Barracuda has protected organizations from evolving threats with its BarracudaONE cybersecurity platform, which maximizes cyber resilience by unifying security across email, data, networks, applications, and managed XDR. Barracuda uses Databricks as its enterprise data platform, consolidating fragmented data silos to power ML operations, real-time threat correlation, and business intelligence. Using Databricks Genie, the team quickly developed and launched features like natural language log search for its managed XDR solution, allowing customers to query billions of security events in plain language while maintaining strict data isolation.
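The data-isolation requirement mentioned above can be illustrated with a minimal sketch. This is not Barracuda's or Genie's implementation; the function and field names (`tenant_id`, `scoped_query`) are invented for the example. The idea is that whatever filter a natural-language layer generates, it only ever runs over one customer's rows:

```python
# Hypothetical sketch of per-customer data isolation around a natural-
# language query layer. Names are illustrative, not a real API.

def scoped_query(events, tenant_id, predicate):
    """Apply a caller-supplied predicate, but only over one tenant's rows,
    so a generated query can never cross customer boundaries."""
    tenant_rows = [e for e in events if e["tenant_id"] == tenant_id]
    return [e for e in tenant_rows if predicate(e)]

events = [
    {"tenant_id": "a", "severity": "high", "msg": "failed login"},
    {"tenant_id": "b", "severity": "high", "msg": "port scan"},
    {"tenant_id": "a", "severity": "low",  "msg": "dns lookup"},
]

# "Show me high-severity events" for tenant "a" never touches tenant "b".
hits = scoped_query(events, "a", lambda e: e["severity"] == "high")
```

The key design point is that the tenant filter is applied by the platform, outside the generated predicate, so isolation does not depend on the model producing a correct query.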
Neal Bradbury is Chief Product Officer at Barracuda, responsible for product management, engineering, security, and cloud operations. He has led the shift toward what Barracuda calls AI-native product development, in which intelligence is built into the core of every application rather than added as an interface on top.
The thread running through our conversation was consistent: in an era where attackers operate at scale, the defenders winning with AI are those treating their proprietary security telemetry as a strategic asset. They aren't just adding AI tools; they're building intelligence directly into the data layer to stay ahead of evolving threats.
What AI-native actually means
Aly McGue: How do you define an "AI-native application" in your business versus a traditional application? What's the strategic difference for the customer experience?
Neal Bradbury: For us, AI-native means it's built in, not bolted on. The application must be architected with AI at its core. In security, that means observability, governance, access controls, and enforcement, all built in from day one. We have our Bailey AI Assistant, but the core of how our applications work, whether it's our WAF or our email protection, they're AI-native at their foundation.
The other big difference is that AI-native applications continuously adapt. A traditional application is built a certain way, and it operates that way until someone goes in and changes it. An AI-native application is more dynamic. It responds to changing customer data, changing needs, and changing goals. It meets the customer where they are as things evolve, which matters a lot when the landscape is shifting as fast as it is right now.
In our case, we're collecting threats and risks from customers across the BarracudaONE platform. Every customer has a different risk profile. Every customer needs different threats prioritized. So it can't be rigid. That's really the strategic difference: an AI-native solution adapts to each customer rather than forcing everyone down the same deterministic path.
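Per-customer prioritization of the kind Neal describes can be sketched very simply. This is an illustration of the concept, not Barracuda's scoring model; the categories and weights are invented. The same detections rank differently depending on each customer's risk profile:

```python
# Illustrative sketch: the same detections ranked differently per customer
# based on a customer-specific risk profile. Weights are invented.

def prioritize(detections, risk_profile):
    """Score each detection by base severity times the customer's weight
    for its category, then sort highest-risk first."""
    return sorted(
        detections,
        key=lambda d: d["severity"] * risk_profile.get(d["category"], 1.0),
        reverse=True,
    )

detections = [
    {"category": "phishing", "severity": 3},
    {"category": "ransomware", "severity": 2},
]

# A customer who weights ransomware heavily sees it ranked first...
hospital = prioritize(detections, {"ransomware": 5.0, "phishing": 1.0})
# ...while one with heavy phishing exposure sees the opposite order.
retailer = prioritize(detections, {"ransomware": 1.0, "phishing": 4.0})
```

The point of the sketch is the shape of the logic: the model's output is not a fixed ranking but a function of customer context, which is what makes the behavior non-deterministic across the customer base.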
Embedding intelligence into the security stack
Aly: What did it take to re-architect your core product and embed AI-native features like personalization, recommendation engines, or copilot tools?
Neal: I'd go back to our managed XDR solution as an example. We had to really question the focus and purpose of that offering, and then work backward. What problem are we actually solving? What outcome are we delivering for the customer? Any product manager should start there, but it becomes even more critical when you're embedding AI, because the architecture decisions you make early determine what's possible later.
The foundational piece was organizing the data layer. If your data is everywhere or the schema isn't shared, it causes problems downstream for everything. Being able to normalize the schema gave our machine learning models and agents full context across domains and let them actually do what we needed them to do.
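Schema normalization of this kind can be sketched as a field-mapping step. This is a minimal illustration under invented names, not Barracuda's actual schemas: each product's raw event format is renamed onto one shared schema so downstream models see the same fields regardless of source:

```python
# Minimal sketch of cross-domain schema normalization. Field names and
# source types are invented for the example.

FIELD_MAPS = {
    "email":   {"ts": "timestamp", "sender_ip": "src_ip", "verdict": "outcome"},
    "network": {"time": "timestamp", "source": "src_ip", "action": "outcome"},
}

def normalize(event, source):
    """Rename source-specific fields to the shared schema; keep provenance."""
    mapping = FIELD_MAPS[source]
    out = {shared: event[raw] for raw, shared in mapping.items() if raw in event}
    out["source"] = source
    return out

email_event = {"ts": "2024-01-01T00:00:00Z", "sender_ip": "1.2.3.4", "verdict": "block"}
net_event = {"time": "2024-01-01T00:00:01Z", "source": "5.6.7.8", "action": "allow"}

rows = [normalize(email_event, "email"), normalize(net_event, "network")]
# Both rows now share timestamp / src_ip / outcome, so one model can read both.
```

Once every domain emits the same shape, a single model or agent can correlate across email and network telemetry without per-source special cases, which is the "full context across domains" benefit described above.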
We were also disciplined about taking small bites. We didn't try to migrate everything at once. We started with small pieces, iterated, and worked our way toward the full outcome. You could come up with a fancier way to describe it, but it was: understand what the output needs to be, then iterate your way there.
What came out of that process was real-time streaming detection built with notebooks, ML operations running through MLflow, and multiple machine learning models with 30-plus features that continuously improve. And the exciting part is that we've been able to extend that same platform pattern to other products: our WAF-as-a-service, our automated configuration engine, API security, and advanced bot protection. So the investment compounds.
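To make the streaming-detection idea concrete, here is a toy sketch of processing events incrementally and flagging outliers against a running baseline. It is not the actual pipeline (which runs as streaming jobs with MLflow-managed models); the detector, threshold, and data are invented:

```python
# Toy sketch of incremental streaming detection: flag values that deviate
# sharply from a running baseline. Not the production logic.

class StreamingDetector:
    """Flag values more than `k` times the running mean seen so far."""
    def __init__(self, k=3.0):
        self.k, self.count, self.total = k, 0, 0.0

    def observe(self, value):
        alert = self.count > 0 and value > self.k * (self.total / self.count)
        self.count += 1
        self.total += value
        return alert

det = StreamingDetector(k=3.0)
stream = [10, 12, 11, 9, 50, 10]
alerts = [v for v in stream if det.observe(v)]
```

The structural point, rather than the arithmetic, is what carries over: state is updated per event instead of per batch, so detection latency is bounded by event arrival, not by a nightly job.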
Aligning groups round outcomes, not instruments
Aly: How did you efficiently align product, information science, and engineering groups to work from a shared information and AI platform to speed up time to marketplace for these options?
Neal: I'll sound like a broken record, but it really came down to defining shared outcomes first. Take our impersonation protection feature in Barracuda Email Protection, which protects customers against advanced attacks. The outcome wasn't simple, but it was clear. And that clarity meant teams could drive toward a unified goal without getting lost in tooling debates. We had Databricks as the platform, we had a destination, and we could just execute.
The same logic applies when we work across non-engineering functions. When we went after churn reduction, we needed customer records, product telemetry, and sales data. Being able to bring all of that together in a single enterprise data platform and actually see cross-functional insights is what drove alignment. It wasn't a mandate from the top. It was a shared outcome that everyone could see and measure. That's what moves people.
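The churn example reduces to joining three sources on a shared customer key. The sketch below is purely illustrative; the table names, fields, and risk rule are invented, not Barracuda's model:

```python
# Illustrative only: joining customer records, product telemetry, and sales
# data on one key so churn signals appear in a single view. All names and
# the risk rule are invented for the sketch.

customers = {"c1": {"segment": "smb"}, "c2": {"segment": "enterprise"}}
telemetry = {"c1": {"logins_30d": 2}, "c2": {"logins_30d": 48}}
sales     = {"c1": {"renewal_days": 20}, "c2": {"renewal_days": 200}}

def churn_view(cid):
    """One cross-functional row per customer, built from three sources."""
    row = {"customer": cid, **customers[cid], **telemetry[cid], **sales[cid]}
    # A simple hand-tuned flag: low recent usage plus a near-term renewal.
    row["at_risk"] = row["logins_30d"] < 5 and row["renewal_days"] < 60
    return row

views = [churn_view(c) for c in customers]
```

The insight only appears once the sources are joined: neither low usage nor an upcoming renewal is alarming alone, which is why the shared data platform, not any single team's data, drove the alignment Neal describes.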
Why your data layer is the real differentiator
Aly: How does building AI-native applications on your own data layer give you a deeper, more defensible competitive advantage compared to relying solely on external SaaS models?
Neal: Your own data layer is the differentiator. Full stop. AI agents are only as strong as the proprietary, context-rich data they can access. When you build on your unified security telemetry, you create an advantage that generic SaaS models simply can't replicate.
Because we build on our own data, we can customize for the specific telemetry and insights we're getting across the entire security portfolio. That lets us provide targeted recommendations and make decisions alongside our customers in ways that a one-size-fits-all external model never could.
The way I think about it is this: an AI-native product can use customer-specific deployment and behavior context to adapt and respond in ways an external SaaS AI simply cannot. And that advantage compounds. The more data flows through the system, the better it gets at understanding each customer's unique environment. Nobody can shortcut their way into that.
Closing thoughts
What came through most clearly in this conversation is that AI-native is an architectural commitment, not a feature label. Neal draws a line between products that have AI designed into their foundation and products that add an intelligent interface on top of a traditional system. The difference shows up in how dynamically the product adapts, how well it uses proprietary context, and how defensible the result is over time.
For executives evaluating their own product strategies, the question worth sitting with is: Is intelligence built into the core of what you ship, or is it layered on top? The answer determines not just what your product can do today, but how fast it can evolve when the landscape shifts again.
To learn more about building an effective operating model, download the Databricks AI Maturity Model.
