During the Red Hat Summit, Red Hat reported that Hitachi, Ltd. (TYO: 6501) rolled out an internal AI platform using Red Hat OpenShift AI, as part of a strategy for deploying and governing artificial intelligence organization-wide.
With more than 250 active projects on the platform, Hitachi expects it to eventually manage generative and predictive AI models at scale, across both its IT and operational technology touchpoints. The deployment supports the company's efforts to apply AI to core functions, including call centre operations and more agile software development processes.
From Hitachi's perspective, OpenShift AI has brought more consistency to monitoring how AI is used internally, including performance management and governance standards. The company says it has since seen improvements in development-time KPIs and better collaboration between internal teams.
Hitachi's current focus is on internal applications for AI, but it is exploring the platform's broader applicability across its businesses as the organization turns to artificial intelligence for its own transformation. In this, Hitachi joins organizations across multiple industries grappling with an enterprise-wide approach to adopting AI.
“We’re not just testing AI—we’re applying it in real operational contexts,” said Masahiro Kikuchi, Director, Platform Service Department at Hitachi, Ltd. “Using Red Hat OpenShift AI helped us build a platform that integrates into our day-to-day systems.”
Red Hat’s broader messaging at the summit has focused on enabling AI flexibility — supporting a variety of AI models across different hardware accelerators and cloud environments.
Red Hat’s Vision for AI: Any Model, Any Accelerator, Any Cloud
Red Hat believes the future of AI lies in flexibility and scalability — the ability to deploy any model, on any accelerator, across any cloud. This universal inference approach will allow organizations to unlock greater value from their AI investments while avoiding vendor lock-in and infrastructure limitations.