Dell Technologies has enhanced its Dell AI Data Platform, making it easier for organizations to turn scattered, siloed data into actionable AI insights. The Dell AI Factory offers a flexible platform for AI workloads that decouples storage from compute, reducing bottlenecks and supporting tasks such as training, fine-tuning, retrieval-augmented generation (RAG), and inference.
Key improvements center on the Dell PowerScale and ObjectScale storage engines, which provide strong security, high performance, and multi-protocol access to AI data. PowerScale now integrates with NVIDIA GB200 and GB300 NVL72 rack-scale systems, supporting large-scale GPU environments with a smaller footprint and reduced power and network requirements. ObjectScale delivers scalable S3-native storage that is up to eight times faster than previous generations, with S3 over RDMA providing higher throughput, lower latency, and lower CPU usage.
The platform’s Data Engine, developed with Elastic and Starburst, enables natural-language data search, semantic search, RAG pipelines, and AI analytics across distributed datasets, while NVIDIA cuVS accelerates hybrid keyword-vector searches. A new Agentic Layer improves document automation, aids insight extraction, and integrates AI capabilities with LLMs.
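The idea behind hybrid keyword-vector search can be illustrated with a toy, dependency-free sketch: a lexical overlap score and a cosine similarity over hashed bag-of-words "embeddings" are blended into a single ranking. The embedding, scoring, and weighting here are illustrative stand-ins, not the cuVS or Elastic APIs:

```python
import math

DIM = 16  # toy embedding dimensionality

def embed(text):
    # Toy embedding: deterministic hashed bag-of-words (stand-in for a real model)
    vec = [0.0] * DIM
    for tok in text.lower().split():
        vec[sum(ord(c) for c in tok) % DIM] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def keyword_score(query, doc):
    # Lexical side: fraction of query terms that appear in the document
    q, d = set(query.lower().split()), set(doc.lower().split())
    return len(q & d) / len(q)

def vector_score(query, doc):
    # Semantic side: cosine similarity of the toy embeddings
    return sum(a * b for a, b in zip(embed(query), embed(doc)))

def hybrid_search(query, docs, alpha=0.5):
    # Blend the two signals; alpha weights the keyword side
    scored = [
        (alpha * keyword_score(query, d) + (1 - alpha) * vector_score(query, d), d)
        for d in docs
    ]
    return [d for _, d in sorted(scored, key=lambda s: s[0], reverse=True)]

docs = [
    "GPU clusters accelerate model training",
    "Object storage scales to petabytes",
    "Vector search finds semantically similar text",
]
print(hybrid_search("vector similarity search", docs)[0])
# prints: Vector search finds semantically similar text
```

In production systems the vector side is an approximate nearest-neighbor search over learned embeddings, which is the stage GPU libraries like cuVS accelerate; the blending step stays conceptually the same.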
Together, these upgrades give businesses a scalable, efficient, and secure foundation for adopting AI.