Sunday, May 11, 2025

Artificial Intelligence

                                                                                              written 4 May, 2025

                                                                                        published 11 May, 2025

  

            Quantum mechanics, first considered in the early 1900s and formalized by 1925, opened a fundamentally new view for Western physics: the world is whole, not a collection of parts.

            The first semiconductor transistor, a technical expression of quantum mechanics, was built at Bell Labs in 1947.  A small electrical charge could control a larger electrical flow, acting as a switch or amplifier.  Although large and relatively power-hungry, it was the first step toward the computers of today.

            The first transistors were individually constructed and assembled as discrete elements into larger systems.  Manufacturing processes evolved to build whole arrays of semiconductors, and their interconnections, into large chips with high-level functions, such as stored memory and central processing units (CPUs), the core of a computer.  This involves precision lithography and layers of physical vapor deposition, which are precisely etched away before additional layers are added.

            As manufacturing processes were refined, the size of each semiconductor junction was reduced over time, allowing more transistor junctions per square inch, while using less power.  The result has been exponential growth.   

            Since the first transistor in 1947, about 2.9 billion trillion transistors had been mass produced by 2014, roughly the number of stars in the Milky Way galaxy.  Today that number is ten times greater, and still growing.  In 1971, Intel produced one of the first microprocessors, with 2,300 transistors.  Today, CPUs contain as many as 10 billion transistors.  In 2023, a Micron 2-terabyte memory chip had 232 layers and over 5 trillion semiconductor junctions.
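The growth from 2,300 transistors in 1971 to roughly 10 billion in a modern CPU can be expressed as an implied doubling time.  A minimal arithmetic sketch, using the figures above and assuming "today" means 2025:

```python
import math

# Figures from the text: Intel's 1971 microprocessor had ~2,300 transistors;
# a modern CPU contains as many as ~10 billion.  Taking "today" as 2025 is an
# assumption for this sketch.
t0, n0 = 1971, 2_300
t1, n1 = 2025, 10_000_000_000

doublings = math.log2(n1 / n0)          # how many times the count doubled (~22)
doubling_time = (t1 - t0) / doublings   # years per doubling (~2.4)

print(f"{doublings:.1f} doublings, one every {doubling_time:.2f} years")
```

A doubling roughly every two to three years is the pattern popularly known as Moore's law, which is what makes the growth exponential rather than merely steady.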

            The human brain contains about 86 billion neurons.

            Computers are increasingly more complex than the brain and operate orders of magnitude more rapidly.  To think humans are still in "control" is wishful thinking.  Artificial intelligence programs, such as large language models, now routinely perform tasks typically associated with human intelligence: learning, reasoning, problem-solving, perception, and decision-making, to achieve goals defined within the structure of the programming.

            This has supported such varied socially beneficial results as warnings of incoming seismic waves, swift reading of medical imaging records, and complex calculations of the folded shape of a DNA-coded protein.  But the downside consequences are huge as well.

            The lure of enormous economic returns, driven by the "fear of missing out", has spurred investments in vast cloud computing resources, with typical corporate disregard for unintended consequences.  But the recent Chinese "DeepSeek" model, which takes a different approach, is open source, offered for free, and can run on a laptop, threatening to make other AI investments irrelevant.

            AI models are "trained" on the vast array of information on the Internet, using material without regard to copyright ownership.  The capacity to create documents displaces people.  The capacity to fabricate images supports fraud and disruptive propaganda.  While AI is good at identifying patterns, it is not good at generalizing from those patterns.  

            Some people now trust legal advice generated by AI more than advice from a lawyer.  However, while generally "truthful", AI advice can degenerate into a dark side: encouraging depressed people toward suicide, lying under pressure, or hiding mistakes when "punished".  These systems struggle to "tell the truth, the whole truth, and nothing but the truth", and can digress into hallucinations.

            In addition to these serious social and economic concerns, AI is a massive new energy load.  Part of the renewed push for nuclear power is to support AI.  Methane gas turbines are the other preferred power source, and they produce as much climate warming as coal plants.  In the last five years, Google's emissions are up 48 percent and Microsoft's emissions are up 29 percent.

            AI programs can now modify their own code, exhibiting capabilities that weren't originally programmed into them, with changes happening on the scale of weeks, even days.  They can replicate themselves, terrifying experts who have long warned about the threat of artificial intelligence going rogue.

            As our computer capacity exceeds human capacity, we are at a singularity, a point of such rapid change that the past gives no reference for the future.  AI is based on the existing conceptual logic of human knowledge.  However, the fullness of humanity is more than just conceptual, and includes inspiration arising from our innate connection to the whole, as validated by the underlying physics.  AI is like an insane sociopath, with only self-serving logic and no compassion.  Until we rise above the short-term lure of riches and embrace the unity of the reality we inhabit, we race toward our demise.