KDDI Eyelet has rolled out a GPU procurement and AI inference support service built on Akamai Cloud, starting April 13, 2026. Strip away the noise and this is about one thing: getting companies past the ‘we’re testing AI’ phase and into actual production without blowing up budgets.
Currently, most firms hit the same wall. GPU costs are unpredictable, infrastructure is messy, and vendor lock-in kills flexibility. KDDI Eyelet is trying to clean that up with a bundled approach. You get GPU access, platform setup, and ongoing operations in one pipeline instead of stitching vendors together.
The pricing angle is the hook. Entry-level GPU usage starts around 80 yen per hour, which undercuts typical cloud GPU rates. Add to that low-latency performance and deployment within Japan, and the service starts addressing both cost and compliance concerns at once.
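To make the pricing claim concrete, here is a back-of-envelope sketch. Only the 80 yen/hour entry rate comes from the announcement; the comparison rate and the usage pattern are illustrative assumptions, not published figures.

```python
# Back-of-envelope GPU cost estimate.
# ENTRY_RATE_JPY is the article's entry-level figure; TYPICAL_RATE_JPY
# is a hypothetical "typical cloud GPU" rate used only for comparison.

ENTRY_RATE_JPY = 80
TYPICAL_RATE_JPY = 300  # assumption, not a quoted competitor price

def monthly_cost(rate_jpy_per_hour: float, hours_per_day: float, days: int = 30) -> float:
    """Cost in yen for one GPU used hours_per_day, every day, for `days` days."""
    return rate_jpy_per_hour * hours_per_day * days

# An inference workload running 8 hours a day on one GPU:
entry = monthly_cost(ENTRY_RATE_JPY, 8)      # 80 * 8 * 30 = 19,200 yen
typical = monthly_cost(TYPICAL_RATE_JPY, 8)  # 300 * 8 * 30 = 72,000 yen
print(f"entry: {entry:,.0f} yen/month, typical: {typical:,.0f} yen/month")
```

The point is not the exact numbers but the shape of the math: at hourly billing, the gap between rates compounds linearly with usage, which is exactly the unpredictability firms are trying to escape.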
The bigger play sits in the background. Distributed cloud models like Akamai’s are gaining traction as companies look for alternatives to centralized hyperscalers. If this works, it nudges the market toward more flexible, multi-cloud AI infrastructure.
Net net, this is less about a new service and more about removing friction that’s been quietly slowing down enterprise AI adoption.