This article is mostly non-technical; additional links are noted for further study.
https://www.jotrin.com/technology/details/a-brief-history-of-the-development-of-ai-chips
https://en.wikipedia.org/wiki/Artificial_intelligence
Dr. B. K. Bose (a pioneer in advanced power electronics) explained it to me in simple words as follows:
“AI IS ARTIFICIAL EMULATION OF HUMAN THINKING BY A COMPUTER. OUR BRAIN HAS NEURAL NETWORK (BRAIN NERVOUS SYSTEM) THAT GIVES HUMAN INTELLIGENCE. IN COMPUTER, WE CAN HAVE ARTIFICIAL NEURAL NETWORK (ANN) THAT CAN BE TRAINED LIKE THE NEURAL NETWORK OF OUR BRAIN. SO COMPUTER ACTS LIKE A HUMAN BRAIN WHICH HAS INTELLIGENCE”
MACHINE LEARNING == WHERE COMPUTER IS THE MACHINE. LEARNING IS TRAINING OF MACHINE OR COMPUTER NEURAL NETWORK.
AI = COMPUTER INTELLIGENCE LIKE A HUMAN BEING.
More of his work on AI for power electronics applications can be found here:
https://usbengalforum.com/ai-for-power-electronics/
I intend to write this article in a discussion format; please add your comments at the end of the article and share your thoughts with other viewers.
As an engineer, I always thought that AI was a daily affair in our lives. The word AI did not resonate in my mind. Nonetheless, Artificial Intelligence (AI) has suddenly become a household buzzword, which persuaded me to read articles on AI from various websites. During the last 70 years, many forms of software have assisted our lives. We keep ourselves warm not by turning a valve to control temperature; instead, a hardware- and software-driven system maintains our cozy lives. Don't those fall into the category of so-called AI? Maybe not. I looked for an appropriate present-day definition of AI. There are many shades of meaning to AI, and I realize that at present most people talking about AI mean generative AI. Accepting that naming, let us continue the discussion. I probably would have named it "distributed client/server architecture"; that name would likely have drawn less legislative wrangling than we see now. Most of the legislative issues are application-oriented, not about the fundamental idea of AI. I spoke to a few seniors (non-technical) about AI; some of them were disturbed, urged serious legislative restrictions, and pointed out that this technology would create machines that are clones of the human species and eventually destroy us. To quench their fear, I told them AI is just an upgraded client/server methodology in computer processing, no competition for our brain power.
The semiconductor industry is at the forefront of AI, and it is going through rapid change to develop AI chips. Because of the complexity of generative artificial intelligence (AI), chip design involves challenges in constraints such as footprint (space), speed of parallel operation, and heat dissipation. Decision-making for output will require sophisticated software able to run on the hardware platform. Hardware and software will both need a quantum leap; now is the time for a breakthrough, and the success of AI will depend on the right combination of the two.
Various branches of artificial intelligence will transform our world. Machine learning (ML) and generative AI, for example, serve very different purposes and operate in their own domains. Machine learning focuses on building systems capable of learning from data and identifying patterns, while generative AI creates new content such as text, images, and video that mimics human creations.
Incidentally, I once worked on the idea of machine learning (a software-driven machine) that calculated the derivative constant continuously, instead of using a fixed derivative constant, for closed-loop PID control. I consider that PID software to be simple machine-learning software. Don't client-server software architecture and the idea of tree nodes also fall under AI?
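To make that adaptive-derivative idea concrete, here is a minimal sketch, not the author's original software: a discrete-time PID loop in which the derivative gain is nudged up or down depending on whether the error is growing or shrinking. The class name, the adaptation rule, and the toy plant at the bottom are all illustrative assumptions.

```python
# Minimal sketch (illustrative only): a PID loop whose derivative gain Kd
# adapts continuously from recent error behavior instead of staying fixed.
class AdaptivePID:
    def __init__(self, kp, ki, kd, kd_rate=0.01, dt=0.1):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.kd_rate = kd_rate          # how quickly Kd is allowed to change
        self.dt = dt                    # sample time in seconds
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt

        # Simple "learning" step: if the error is still growing, add damping
        # by raising Kd a little; if it is shrinking, relax Kd slightly.
        if abs(error) > abs(self.prev_error):
            self.kd += self.kd_rate
        else:
            self.kd = max(0.0, self.kd - self.kd_rate)

        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


# Example use: drive a toy first-order process toward a setpoint of 1.0.
pid = AdaptivePID(kp=2.0, ki=0.5, kd=0.1)
y = 0.0
for _ in range(100):
    u = pid.update(setpoint=1.0, measurement=y)
    y += (u - y) * pid.dt               # crude plant model, for demonstration only
```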
How does complex AI work? I am noting the following from a Google search.
Generative AI models often utilize neural networks, particularly types known as Generative Adversarial Networks (GANs), Variational Autoencoders (VAEs), and Large Language Models (LLMs). Here's a simplified breakdown of the process:
- LLMs: These models, such as GPT-4o, LLaMA, or Google Gemini, are trained on vast amounts of text data and can generate human-like text by predicting the next word in a sentence (a toy sketch of this next-word idea appears after this list).
- GANs: Consist of two neural networks, a generator and a discriminator, that work in tandem. The generator creates new data instances while the discriminator evaluates them.
- VAEs: Use probabilistic models to generate new data, allowing for the creation of diverse, novel outputs based on learned representations of the input data.
- Chatbots and Virtual Assistants: Tools like ChatGPT can generate human-like text.
- Deepfake Technology: AI can create highly realistic video and audio recordings that appear to be real, raising both exciting possibilities and ethical concerns.
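To make the "predicting the next word" idea in the LLM bullet above tangible, here is a toy sketch. A real LLM is a transformer neural network with billions of trained parameters; this is only a word-pair (bigram) counter over a made-up training string, so every name and string in it is an illustrative assumption.

```python
# Toy illustration of "predict the next word". Real LLMs use transformer
# neural networks trained on vast text corpora; this bigram counter only
# shows the basic idea of predicting what tends to follow a given word.
from collections import Counter, defaultdict

training_text = (
    "the brain has billions of neurons "
    "the brain learns by forming connections "
    "the network learns by adjusting weights"
)

# Count which word follows each word in the training text.
next_word_counts = defaultdict(Counter)
words = training_text.split()
for current, following in zip(words, words[1:]):
    next_word_counts[current][following] += 1

def predict_next(word):
    """Return the word most often seen after `word` during training."""
    if word not in next_word_counts:
        return None
    return next_word_counts[word].most_common(1)[0][0]

print(predict_next("the"))      # -> "brain"  (follows "the" most often)
print(predict_next("learns"))   # -> "by"
```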
I believe that machine learning and generative AI can complement each other in powerful ways. This combination will take AI to a different level, mimicking the human brain, as the machine will correct itself and become a better decision-maker. But it will never come close to collective human brain power.
NEURON:
Neurons are the fundamental building blocks of machine learning (ML). Each neuron has many inputs and produces a single output.
That output goes as input to another layer of neurons, and this continues layer after layer.
A neural network is a type of AI that uses ML processes to teach computers how to process data in a way that mimics the human brain. Incidentally, our brain has close to 100 billion neurons (nerve cells).
Artificial neural networks (ANNs) are made up of artificial neurons that work together to solve problems.
Each neuron carries its own piece of knowledge and is interconnected with similar neurons. Neurons in one layer pass information to neurons in the next layer, and the output of one neuron can drive the inputs of many other neurons.
A neural network teaches the computer how to process data. It is a type of ML process, called deep learning, that arranges neurons in a multi-layered structure mimicking the human brain.
It creates an adaptive system that computers use to learn from their mistakes and improve continuously. Thus, artificial neural networks attempt to solve complicated problems with greater accuracy.
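As a concrete picture of the neuron and layer structure described above, here is a minimal sketch: each artificial neuron weights its inputs, sums them with a bias, and passes the result through an activation function, and the outputs of one layer become the inputs of the next. The weights, biases, and layer sizes below are made-up numbers for illustration; in real machine learning they are found by training.

```python
# Minimal sketch of an artificial neuron and a tiny two-layer network.
# Weights and biases are made-up illustrative numbers, not trained values.
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: weighted sum of inputs plus a bias, squashed
    between 0 and 1 by a sigmoid activation function."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))

# Three inputs feed two hidden neurons.
inputs = [0.5, 0.8, 0.2]
hidden = [
    neuron(inputs, weights=[0.4, -0.6, 0.9], bias=0.1),
    neuron(inputs, weights=[0.7, 0.3, -0.5], bias=-0.2),
]

# The hidden layer's outputs become the inputs of the single output neuron:
# the "output of one layer feeds the next layer" idea described above.
output = neuron(hidden, weights=[1.2, -0.8], bias=0.05)
print(output)
```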
AWS was an early adopter of AI methodology. OpenAI, chatbots, Perplexity, and ChatGPT are commonly known buzzwords associated with AI technology.
If you are interested in ChatGPT, click this link:
https://openai.com/index/chatgpt/
Since the structure and ideas of AI are modeled on the human brain, we should try to understand how the brain works.
FRONTAL CORTEX:
The cerebral cortex is the outermost portion of the brain, containing billions of tightly packed neuronal cell bodies that form what is known as the gray matter of the brain. The white matter of the brain and spinal cord is formed by the heavily myelinated axonal projections of these neuronal cell bodies. The cerebral cortex has four major divisions known as lobes: the frontal, temporal, parietal, and occipital lobes.
The basic workings of the nervous system depend a lot on tiny cells called neurons. The brain has billions of them, and they have many specialized jobs. For example, sensory neurons send information from the eyes, ears, nose, tongue, and skin to the brain. Motor neurons carry messages away from the brain to the rest of the body.
All neurons relay information to each other through a complex electrochemical process, making connections that affect the way you think, learn, move, and behave.
As you grow and learn, messages travel from one neuron to another over and over, creating connections in the brain. That is why learning requires high concentration; once something is learned and the cell connections are not broken, it becomes second nature.
For further study, use the following link:
https://www.youtube.com/watch?v=qrvK_KuIeJk
Comment by dchaudhuri — November 2, 2024 @ 9:40 pm