Scality has shared findings from independent research conducted by Freeform Dynamics examining how organisations move AI initiatives from experimentation to operational deployment. One key finding is that data and storage infrastructure can become constraints in production environments.
As AI adoption progresses, there is an increasing trend towards deploying private, or sovereign, AI. This approach enables organisations to maintain control over the infrastructure supporting their AI models and data, with potential benefits for performance, regulatory alignment, and long-term cost management. As infrastructure costs decrease, private AI is increasingly considered alongside cloud-based AI services.
Object storage is identified as an important component within AI environments, particularly as deployments move into production. Infrastructure decisions are increasingly shaped by requirements for reliable performance, governance, cost control, and lifecycle management.
While GPUs and compute capacity are often central to AI infrastructure discussions, the research highlights a focus on systems that stage, govern, protect, and reuse data across inference-driven pipelines.
Key findings from the research include:
- Adoption of Object Storage: 91% of enterprises using private AI report significant use of object storage, the highest among storage architectures.
- Infrastructure Sovereignty: 81% of organisations indicate that control over private AI infrastructure is important, driven by sovereignty and compliance considerations.
- Performance Prioritisation: 57% of enterprises rank storage performance as a higher priority than compute capacity or network bandwidth when addressing potential bottlenecks.
- Hybrid Architecture Preference: Many organisations adapt existing compute and storage infrastructure while also deploying purpose-built components, a hybrid approach to architecture.
- Metadata Handling Challenges: Some organisations identify metadata management at scale as a bottleneck, indicating a need for storage platforms designed to handle metadata-intensive workloads.
The survey reflects enterprises operating AI in production environments rather than pilot phases. As AI workloads become more inference-driven, tiered architectures are being adopted. These typically combine high-performance tiers for active workloads with larger capacity tiers for data retention and reuse. Within these models, S3-compatible object storage plays a key role in supporting AI pipelines.
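As a rough illustration of the tiered placement described above, the sketch below routes data objects between a high-performance tier and a larger capacity tier based on access recency. The tier names, threshold, and `choose_tier` function are hypothetical illustrations, not part of Scality's products or the research itself.

```python
# Hypothetical sketch of tier-placement logic for an AI data pipeline.
# Tier names, the 7-day threshold, and this function are illustrative only.

def choose_tier(days_since_last_access: int, hot_threshold_days: int = 7) -> str:
    """Route recently accessed objects to a high-performance tier and
    colder objects to a capacity tier for retention and reuse."""
    if days_since_last_access <= hot_threshold_days:
        return "performance"
    return "capacity"

# Example: classify a small catalogue of pipeline artefacts by age in days.
catalogue = {
    "embeddings/batch-001": 2,    # recently used by inference
    "checkpoints/epoch-12": 30,   # cold model checkpoint
    "training-logs/2024-q1": 90,  # archival data kept for reuse
}
placement = {key: choose_tier(age) for key, age in catalogue.items()}
```

In a real deployment, a placement decision like this would typically map onto bucket policies or lifecycle rules expressed through the S3 API, which is why S3 compatibility matters for the capacity tier.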
Scality’s approach highlights the use of scalable architectures that support data proximity, accommodate multiple AI models, and balance performance with governance requirements. This aligns with approaches to sovereign AI development, supporting organisations in scaling their operations while maintaining oversight.