Hitachi, Ltd. began supporting JGC Global Corporation in building and running a proper Data Quality Management setup in April 2026. The focus is simple: if the data is messy, AI will stay unreliable. There is no way around that.
The two companies have worked together since the project began. The team assessed the current state of JGC's operations, defined a target model, and laid out a strategy to close the gap between the two. The approach draws on international standards such as DMBOK and ISO 8000, but those frameworks still need development work to turn them into measures that hold up in actual implementation. The goal is to establish precise data quality standards, monitor operations against them, and build a system that sustains ongoing improvement rather than winding down after the initial rollout.
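To make "precise data quality standards followed by monitoring" concrete, here is a minimal sketch of rule-based record checking of the kind such a setup might automate. The field names, rules, and sample records are invented for illustration; they are not from JGC or Hitachi.

```python
# Hypothetical sketch: rule-based data quality checks.
# Each rule is a named predicate; a record "passes" a rule when the
# predicate returns True. All names and thresholds are illustrative.

def check_record(record, rules):
    """Return the names of the rules this record violates."""
    return [name for name, rule in rules.items() if not rule(record)]

# Example quality rules: completeness, format consistency, valid ranges.
rules = {
    "has_project_id": lambda r: bool(r.get("project_id")),
    "date_iso_format": lambda r: len(r.get("start_date", "")) == 10
                                 and r.get("start_date", "").count("-") == 2,
    "cost_non_negative": lambda r: isinstance(r.get("cost"), (int, float))
                                   and r["cost"] >= 0,
}

records = [
    {"project_id": "P-001", "start_date": "2026-04-01", "cost": 1200.0},
    {"project_id": "",      "start_date": "04/01/2026", "cost": -5},
]

# A monitoring loop would run checks like this continuously and report
# violation rates over time, rather than printing once.
for rec in records:
    violations = check_record(rec, rules)
    print(rec.get("project_id") or "<missing>", violations)
```

The point of expressing standards as named, executable rules is that the same definitions can drive both the one-off assessment and the ongoing monitoring, instead of living only in a document.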
A lot of companies are hitting the same wall right now. AI is being pushed everywhere, but results are inconsistent. Outputs are off, accuracy is shaky, and trust drops quickly. In most cases, the issue is not the model. It is the data underneath. Different formats, unclear definitions, uneven quality. It adds up.
JGC is trying to clean that up across its global operations, especially as it pushes AI and IoT deeper into its EPC business. Hitachi is bringing in its experience from multiple industries to help structure this properly.
The interesting part is the way they are approaching improvement. It is not just a slow, checklist-driven model. They are combining structured cycles with faster feedback loops so issues can be spotted and fixed as they show up.
If this works, the outcome is not just better data hygiene. It is more reliable AI across projects, and fewer situations where teams are second-guessing the outputs they get.


