Generative AI | News, how-tos, features, reviews, and videos
Office environments need to change to foster collaboration, and employers need to close the AI skills gap, Cisco reports in its hybrid work study.
HPE Aruba is using proprietary LLMs to better understand questions posed in its Networking Central platform and generate more accurate, detailed responses.
The facility, with sites in the US and South Korea, will develop chips to support the processing demands of ‘artificial general intelligence,’ which refers to AI that can perform as well as or better than humans.
As part of its extended collaboration with AWS, GCP, Microsoft, IBM, and Oracle, the chip designer will share its new Blackwell GPU platform and foundation models, and integrate its software across the hyperscalers' platforms.
New servers and storage services are targeted at high-performance workloads, which these days means AI.
The partnership will give joint customers a CPU platform on which to run computationally intensive AI workloads.
Once optional, GPUs are becoming mandatory in servers. Companies are prioritizing investment in highly configured server clusters for AI, research firm Omdia reports.
Supermicro and Lenovo are expanding their AI hardware offerings, Intel is previewing chips designed for 5G and AI workloads, and Dell is embracing telecom.
IBM is focusing its AI initiatives on the business case rather than pushing customers to focus on the technology itself, and that's what sets it apart.
The nonprofit IT certification and training organization is creating new products and programs to address the growing AI jobs market.