Welcome to another edition of Talsco Weekly
- IBM i Brief: IBM Previews Generative AI tool for converting COBOL to Java. Will IBM bring Code Assist to IBM i?
- AI: Possible Paths for AI on the IBM i. AI Chip Boom. How will AI and ML be used in the Data Center? What are AI Accelerators?
IBM i Brief
IBM Previews Generative AI tool for converting COBOL to Java
IBM this week announced it will preview a generative artificial intelligence (AI) tool for converting COBOL code running on mainframes into Java.
What does it do?
It uses generative AI to translate COBOL code into Java, producing object-oriented Java rather than a literal, line-by-line transliteration of the COBOL syntax.
The goal, however, is not to move COBOL applications off the mainframe but rather to make mainframes more accessible to a wider pool of developers who know Java, which should increase the overall pace of innovation for application development.
Will IBM bring Code Assist to IBM i?
“IBM Research and Big Blue’s Software Group have been collaborating to bring generative AI capabilities to market through the Watsonx stack of large language models and related tools.”
What is Watsonx?
Watsonx is IBM's stack of large language models and related tooling. It offers capabilities similar to OpenAI's GPT series, but with a focus on customization for individual businesses, and represents an evolution of IBM's AI efforts, building on the earlier Watson platform.
As usual, IT Jungle does a deep dive into this possibility.
AI
Possible Paths for AI on the IBM i
The IBM i community is starting to experiment with AI. There are certain aspects of the emergence of AI in our lives and work that make it a little unsettling.
“Just because something is unfamiliar and initially intimidating doesn’t mean it lacks value; with closer examination, its potential benefits and applications might emerge.”
This week, IT Jungle breaks down Profound Logic’s exploration of the possibilities for AI. While they are focused on Profound’s products, they offer context as to how AI can be leveraged throughout the IBM i ecosystem.
The three topic areas are:
- AI Co-Pilots: A tool that uses artificial intelligence to assist developers by providing code suggestions, enhancing productivity, and improving code quality as they write software.
- AI Plugins: Components that incorporate artificial intelligence capabilities into existing software applications or platforms.
- AI Chatbots: Software programs powered by artificial intelligence that can simulate a conversation with users, typically via text or voice interactions, to assist, inform, or entertain.
Takeaway: The IBM i server might not be leading the charge in AI innovation, but that doesn’t negate its relevance.
AI Chip Boom
The growth of AI is hard to question: Nvidia, AMD, and Intel are racing to supply the chips that power it.
Nvidia’s Data Center Revenue has quadrupled over the past two years, outpacing both AMD and Intel. “The company achieved dominance by recognizing the AI trend early, becoming a one-stop shop offering chips, software, and access to specialized computers.”
How will AI and ML be used in the Data Center?
“Artificial Intelligence (AI) and Machine Learning (ML) continue to make great strides in their evolution, and they are now having a tangible impact on data center operations and IT management.
“We are seeing AI and ML applied to functions that range from power and cooling to resource management and allocation. To that end, we have seen data- and algorithm-driven technologies deployed in areas such as fast failure detection/prediction, root cause analysis, power usage optimization, and resource capacity allocation optimization; all in the quest to ensure that data centers are operating as efficiently as possible.”
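As a toy sketch of the failure-detection idea in the quote above — synthetic readings and a simple z-score rule standing in for a real ML model — flagging an abnormal sensor value might look like this:

```python
import statistics

# Hypothetical server-inlet temperature readings (Celsius), one per minute.
readings = [21.4, 21.6, 21.5, 21.7, 21.5, 21.6, 29.8, 21.5]

mean = statistics.mean(readings)
stdev = statistics.stdev(readings)

# Flag any reading more than 2 standard deviations from the mean --
# a minimal stand-in for the data-driven failure detection described above.
anomalies = [t for t in readings if abs(t - mean) > 2 * stdev]
print(anomalies)  # [29.8]
```

Production systems use far more sophisticated models, but the principle is the same: learn what "normal" looks like from operational data, then alert on deviations before they become outages.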
What are AI Accelerators?
AI chips, often referred to as AI accelerators or hardware accelerators, have become an integral part of modern data centers due to the explosive growth in AI workloads. These AI-specific chips are designed to accelerate AI-related tasks like deep learning, machine learning, and neural network processing.
In other words:
“An AI accelerator is a dedicated processor designed to accelerate machine learning computations. Machine learning, and particularly its subset, deep learning is primarily composed of a large number of linear algebra computations, (i.e. matrix-matrix, matrix-vector operations) and these operations can be easily parallelized. AI accelerators are specialized hardware designed to accelerate these basic machine learning computations and improve performance, reduce latency and reduce cost of deploying machine learning based applications.”
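To make the point in the quote concrete, here is a minimal sketch (hypothetical layer sizes, plain NumPy standing in for accelerator kernels): a single dense neural-network layer is one matrix–vector product plus an activation, exactly the kind of operation an AI accelerator parallelizes.

```python
import numpy as np

rng = np.random.default_rng(0)

# A "dense layer" is one matrix-vector product plus a bias and activation.
# These shapes are illustrative; real models stack thousands of such layers.
W = rng.standard_normal((512, 784))   # weights: 512 outputs x 784 inputs
b = rng.standard_normal(512)          # bias vector
x = rng.standard_normal(784)          # one input vector

h = np.maximum(W @ x + b, 0.0)        # matrix-vector product + ReLU

# Each of the 512 output elements is an independent dot product, so they
# can all be computed in parallel -- the property accelerators exploit.
print(h.shape)  # (512,)
```

On a CPU this runs serially or with modest vectorization; an accelerator computes the independent rows simultaneously, which is where the performance and cost gains come from.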
Here are a few ways AI Chips are being used:
Accelerated Training: Training deep learning models can take weeks or even months using conventional CPU-based systems. AI accelerators, especially GPUs (Graphics Processing Units) and dedicated AI chips, can accelerate this process, reducing training times significantly.
Inference Acceleration: Once a model is trained, it is used to make predictions on new data, a process known as inference. AI chips speed up inference, allowing for real-time or near-real-time insights.
Reduced Power Consumption: Traditional CPUs are general-purpose and not optimized for AI workloads. AI-specific chips can perform AI computations more efficiently, which leads to energy savings. This is a critical factor for data centers, where energy costs and heat dissipation are primary concerns.
Distributed and Edge Computing: Some data centers support edge computing, where computation happens closer to the source of data (e.g., IoT devices). AI chips, especially those optimized for low power, are essential in these scenarios as they offer fast insights without the need to transmit vast amounts of data to a central data center.
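To make the training/inference distinction above concrete, here is a minimal sketch using synthetic data and plain NumPy (all names hypothetical): training iteratively adjusts weights over many passes, while inference is a single cheap forward pass — the part accelerators speed up in production.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: label is 1 when the sum of the features is positive.
X = rng.standard_normal((200, 3))
y = (X.sum(axis=1) > 0).astype(float)

# --- Training: many passes over the data, adjusting weights (the slow part).
w = np.zeros(3)
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))     # current predictions
    w -= 0.1 * (X.T @ (p - y)) / len(y)    # one gradient-descent step

# --- Inference: a single forward pass on new data (the part served live).
x_new = np.array([2.0, 1.0, 0.5])
prob = 1.0 / (1.0 + np.exp(-(x_new @ w)))
print(prob > 0.5)  # True: a positive-sum input is classified as class 1
```

The asymmetry matters for hardware: training runs once on large clusters, while inference runs millions of times a day, so even small per-request speedups from dedicated chips compound quickly.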
Join
Sign up for Talsco Weekly to get the latest news, insights, and job openings for the IBM i professional.
Contact us
If you are an RPG programmer looking to explore opportunities or a client who is looking for a talented IBM i professional, please contact us. We look forward to assisting you.
Share
Do you know of someone who could benefit from Talsco Weekly? If so, please use the social media buttons to spread the word. Thank you!