
According to the observation of Gordon Moore, one of Intel's founders, the number of transistors on a chip doubles approximately every two years. According to scientists, however, this "natural law" has now reached its physical limits.
For about half a century, computer technology advanced with astonishing regularity. Transistors grew steadily smaller, chips became faster, and computing power multiplied every few years.
This silent but steady progress transformed countless fields, from more accurate weather forecasts to advanced scientific simulations, from realistic graphics to artificial intelligence.
This regular increase was known as Moore's Law.
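The doubling rule behind Moore's Law is simple exponential growth: after t years, a starting count multiplies by 2^(t/2). A minimal sketch of that arithmetic (the 4004's transistor count is a well-known figure; the projection itself is only illustrative):

```python
def transistors(n0: int, years: float, doubling_period: float = 2.0) -> float:
    """Project a transistor count, assuming one doubling per period."""
    return n0 * 2 ** (years / doubling_period)

# Example: the Intel 4004 (1971) had about 2,300 transistors.
# Projecting 40 years ahead at one doubling every two years:
projected = transistors(2300, 40)
print(f"{projected:,.0f}")  # about 2.4 billion -- the scale of chips circa 2011
```

The point is how quickly the curve climbs: 20 doublings turn a few thousand transistors into billions, which is why even a small slowdown in the doubling period is felt across the whole industry.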
THE PROBLEM: "LIMITS OF PHYSICS"
Associate Professor Dr. Domenico Vicinanza, an expert in intelligent systems and data science at Anglia Ruskin University, emphasizes that the end of Moore's Law should not be confused with technological stagnation. The main issue is that the physical assumptions that made it possible to shrink transistors further are no longer valid.
At this point, the question changes:
"What will replace automatic speed increases?"
The answer is not a single revolution, but a combination of multiple strategies.
KEY APPROACHES OF THE NEW ERA
1. Smarter transistors and new materials
Engineers are trying to improve transistor designs to reduce energy loss and electrical leakage. While these refinements may not deliver leaps as large as before, they play a critical role in preserving energy efficiency.
2. The physical architecture of chips is changing
Not everything is placed on a single flat surface anymore. Modern chips shorten data-transfer distances by stacking components on top of each other or positioning them closer together. This improves both speed and energy efficiency.
3. The biggest breakthrough: Specialization
Perhaps the most important feature of the new era is the end of the "one processor for all tasks" approach:
CPUs handle decision-making and control tasks.
GPUs excel in graphics and artificial intelligence tasks requiring parallel computing.
AI accelerators focus on performing many simple operations simultaneously.
Performance now depends not on the speed of a single chip, but on how harmoniously these components work together.
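The division of labor described above can be sketched in miniature. In this toy illustration the "devices" are ordinary Python functions standing in for a CPU, a GPU, and an AI accelerator (the names and the dispatch table are invented for the example, not a real API):

```python
def cpu_control(task):
    # Stand-in for a CPU: sequential decision-making and control work.
    return f"cpu handled {task}"

def gpu_parallel(values):
    # Stand-in for a GPU kernel: the same simple operation applied
    # to every element of the data at once.
    return [v * v for v in values]

def ai_accelerator(values):
    # Stand-in for an AI accelerator: many simple multiply-accumulate steps.
    return sum(v * 0.5 for v in values)

def dispatch(kind, payload):
    # The scheduler's whole job: match each piece of work to the unit
    # suited for it, rather than sending everything to one processor.
    units = {"control": cpu_control, "graphics": gpu_parallel, "ai": ai_accelerator}
    return units[kind](payload)

print(dispatch("graphics", [1, 2, 3]))  # [1, 4, 9]
```

The sketch only captures the idea of routing, not real hardware, but it shows why overall performance now hinges on the scheduler and the interconnect as much as on any single unit.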
While areas such as artificial intelligence, medical diagnosis, navigation, and complex modeling continue to advance rapidly, general-purpose performance improvements may feel slower.
"PERFORMANCE IS NO LONGER FREE"
According to experts, the end of Moore's Law is forcing the world of computing to a more honest point. Performance no longer comes automatically; it must be designed, justified, and paid for with energy, complexity, and cost.