By Stephanie Simone
AI is only as good as the data it’s built upon, and as the AI era continues to evolve, the organizations that succeed won’t just be the ones that have more data—they’ll have better data products.
Building better data products requires well-structured and governed data pipelines. It also requires systems that can leverage AI to learn, adapt, and optimize—often in real time—to continuously deliver fresh, trusted data.
DBTA recently held a special roundtable webinar, "Building Better Data Products with AI, For AI," featuring experts who dove into the essentials of building them.
Irem Radzik, senior director of product marketing, Reltio, said siloed data is the kryptonite of digital and AI transformation goals. Trusted, rich data assets are a competitive advantage in the age of AI. And data products enabling AI agents cannot have “good enough” data.
Reltio Data Cloud provides a trusted, secure system of context, she explained. The platform is made up of Reltio AgentFlow, which are prebuilt agents for data governance and workflows, and Reltio MCP Server, which is the bridge between the data and agents.
Robert Stanley, senior director of special projects at Melissa Informatics, explained that the company provides software and services for AI-enabled data quality, data discovery, harmonization, integration, and research.
Melissa Unison is “Melissa’s well-trained application expert.” Unison accesses and applies application functions, business rules, and reference datasets. Customer benefits include:
- Reduced barrier to entry: less expertise and training required for new customers and products, with guidance and support to discover and connect to data sources, assign content to the correct fields, and suggest next actions.
- Improved ease of use and efficiency: assists in understanding valued data and options for data quality (DQ) and enrichment (opt-in/opt-out); offers data analysis and results-based rule and process suggestions, plus access methods and resources that may be unknown to customers; persists learning, including preferred DQ workflows, for recurring value and ease of use.
- Reliance on data quality expertise for high-quality data products.
- Automated and kept up-to-date.
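To make the kind of DQ assistance described above concrete, here is a minimal, purely illustrative sketch of a rule that validates, standardizes, and suggests next actions for a record. The field names, rules, and suggestions are hypothetical and do not represent Melissa's actual products or APIs.

```python
import re

# A very loose email pattern, for illustration only.
EMAIL_RE = re.compile(r"^[\w.+-]+@[\w-]+\.[\w.-]+$")

def clean_record(record):
    """Return a standardized copy of a record plus suggested next actions."""
    cleaned = dict(record)
    suggestions = []

    # Standardize: trim whitespace and normalize casing on the name field.
    if "name" in cleaned:
        cleaned["name"] = cleaned["name"].strip().title()

    # Validate: flag emails that fail a basic pattern check.
    if not EMAIL_RE.match(cleaned.get("email", "")):
        suggestions.append("verify email address")

    # Enrich (opt-in): suggest postal-code enrichment when it is missing.
    if not cleaned.get("postal_code"):
        suggestions.append("enrich postal code from address reference data")

    return cleaned, suggestions
```

A real system would persist which of these suggestions a customer accepts, so preferred DQ workflows can be reapplied automatically on later runs.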
Connecting to live data sources is difficult and dangerous, said Deepak Vittal, field CTO for Data + Analytics at insightsoftware. AI agents need access to live data for context and accuracy, and data teams must ensure governance and security.
Without data access, AI hallucinates, Vittal noted. AI needs access, not copies; without governance, there is no access. And AI requires data preparation.
He introduced Simba Intelligence, “The AI semantic platform for trusted intelligence.” It offers governed, contextual, and verifiable data access for AI that eliminates hallucinations at the source.
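The "no governance, no access" principle can be sketched in a few lines: an agent's query goes through a policy check, and the gateway refuses outright rather than silently returning partial data. This is a hypothetical illustration of the general pattern, not Simba Intelligence's actual architecture; the roles, columns, and function names are invented.

```python
# role -> columns that role may read (hypothetical policy table)
POLICY = {
    "support_agent": {"customer_id", "name", "open_tickets"},
    "analyst": {"customer_id", "region", "open_tickets", "lifetime_value"},
}

def governed_read(role, columns, rows):
    """Return requested columns from live rows, or refuse the whole query."""
    allowed = POLICY.get(role, set())
    denied = set(columns) - allowed
    if denied:
        # Refuse rather than truncate, so the agent never sees ungoverned data.
        raise PermissionError(f"role {role!r} may not read: {sorted(denied)}")
    return [{c: row[c] for c in columns} for row in rows]
```

Because the check happens at the access layer, every downstream consumer, human or agent, inherits the same policy without copying data out of governance's reach.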
Jerod Johnson, senior technology evangelist, CData Software, agreed that AI is only as good as its data.
Technical requirements for AI-ready architecture need to provide:
- Live, read/write access
- Fine-grained governance
- Scalable data movement
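The three requirements above can be sketched as a single connector abstraction: live read and write against the source, a fine-grained (per-field) governance check, and batched movement so transfers scale. The class and function names here are illustrative assumptions, not CData's API.

```python
from abc import ABC, abstractmethod

class Connector(ABC):
    @abstractmethod
    def read(self, fields):       # live read access
        ...

    @abstractmethod
    def write(self, records):     # live write access
        ...

class GovernedConnector(Connector):
    """Wraps an in-memory table with a per-field governance check."""

    def __init__(self, table, readable_fields):
        self.table = table
        self.readable_fields = set(readable_fields)  # fine-grained governance

    def read(self, fields):
        visible = [f for f in fields if f in self.readable_fields]
        return [{f: row[f] for f in visible} for row in self.table]

    def write(self, records):
        self.table.extend(records)

def move_in_batches(connector, fields, batch_size):
    """Scalable data movement: yield governed rows in fixed-size batches."""
    rows = connector.read(fields)
    for i in range(0, len(rows), batch_size):
        yield rows[i:i + batch_size]
```

In this sketch the governance layer drops disallowed fields at read time, so even a bulk movement job never extracts more than policy permits.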
The CData Connectivity Platform is a leading connectivity framework packaged to handle every integration use case, Johnson said.
For the full webinar, featuring a more in-depth discussion, Q&A, and more, you can view an archived version here.