Industry Updates

'SAMENA Daily' - News

NEC launches its own generative AI

Japanese vendor NEC has joined the growing list of companies aiming to capitalise on AI hype by launching three new offerings.

Perhaps the most significant of these is NEC’s very own large language model (LLM), specifically targeted at the Japanese market. The vendor says it operates with a high degree of accuracy using a relatively small number of parameters – 13 billion, to be precise. By comparison, OpenAI’s GPT-4 reportedly uses around a trillion.

Parameters matter because they are the variables within the LLM that steer the AI towards a particular answer. Think of them as the AI equivalent of neural pathways in human anatomy. As humans, we establish and maintain a network of neural pathways by interacting with the world. Similarly, when LLMs are trained, the ‘strength’ of each pathway is determined by the numerical values assigned to its parameters. These values are adjusted automatically as the AI learns, helping to fine-tune its responses.
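To make that adjustment process concrete, here is a deliberately tiny sketch (not NEC’s model – just a one-parameter illustration): a single numerical weight is nudged by gradient descent until the model’s answers match its training examples, which is the same principle an LLM applies across billions of parameters.

```python
# Toy illustration: one "parameter" (a single weight) is adjusted
# repeatedly during training, mirroring how the numerical values
# assigned to an LLM's parameters are tuned as it learns.

def train_single_parameter(inputs, targets, lr=0.1, epochs=100):
    w = 0.0  # the parameter starts untrained
    for _ in range(epochs):
        for x, y in zip(inputs, targets):
            pred = w * x                 # the model's current answer
            grad = 2 * (pred - y) * x    # gradient of the squared error
            w -= lr * grad               # adjust the parameter value
    return w

# Training data where outputs are 3x the inputs: the parameter
# converges towards 3.0 as the errors shrink.
w = train_single_parameter([1.0, 2.0, 3.0], [3.0, 6.0, 9.0])
print(round(w, 3))  # → 3.0
```

A real LLM does this for billions of such weights simultaneously, which is why training is so computationally expensive.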

A greater number of parameters means there is more room to accommodate nuance and complexity. However, it also comes with a heavier burden in terms of compute and energy consumption. This is what makes NEC’s LLM interesting: because it uses less power and fewer computing resources, it can be deployed flexibly – including in on-premises environments – rather than relying on external data centres or supercomputers.

This brings with it another advantage, one pertaining to data security. Having an LLM deployed on closed-off, company-owned hardware mitigates the risk of so-called data leakage. This is when staff unwittingly share sensitive corporate data by giving it to an AI that then adds it to its knowledge base, ready to be passed on – possibly even to rival companies.

In April, it emerged that Samsung employees had accidentally leaked trade secrets via ChatGPT. A month later, following an internal investigation, Samsung banned staff from using it altogether.

NEC said it began using its in-house generative AI for internal business purposes in May. The system is linked to various internal tools, such as employee chat and web conferencing, and is used by around 20,000 employees, who between them make roughly 10,000 queries per day.

NEC says the system has cut the man-hours needed to create source code for internal system development by 80 percent. It has also halved the time required to create documents, and reduced the time taken to produce meeting minutes from an average of 30 minutes to approximately five minutes.

Now NEC wants to monetise it. Its new Generative AI Service will offer licences for its proprietary LLM, which can be customised according to a customer’s requirements. It will also provide dedicated hardware, software and services to help businesses get the biggest bang for their buck.

Alongside the LLM, NEC has also launched the Generative AI Advanced Customer Programme, a service that supports enterprises in the creation of their own LLMs, as well as the development of software, and training, to help customers actually put them to productive use.

The other initiative is the Generative AI Hub, a specialised organisation that showcases the benefits of generative AI to potential customers. It is staffed by a team of researchers, prompt engineers – who provide precise instructions to AI – consultants, and digital trust specialists.

NEC aims to generate sales of JPY50 billion ($346 million) from its AI activities over the next three years.



Source: https://telecoms.com/522603/nec-launches-its-own-generative-ai/
