The law is named after Douglas Engelbart, whose work in augmenting human performance was explicitly based on the realization that although we make use of technology, the ability to improve on improvements (getting better at getting better) resides entirely within the human sphere.
Haitz's law
Haitz's law claims that every decade, the cost per lumen (unit of useful light emitted) falls by a factor of 10, and the amount of light generated per LED package increases by a factor of 20, for a given wavelength (color) of light.
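As a rough illustration, the Python sketch below compounds both trends over a few decades; the starting values for cost and light output are purely illustrative assumptions.

```python
# Hypothetical worked example of Haitz's law; the starting values are illustrative only.
cost_per_lumen = 1.0      # assumed starting cost of light output, in $/lm
flux_per_package = 10.0   # assumed starting light output per LED package, in lm

for decade in range(1, 4):
    cost_per_lumen /= 10      # cost per lumen falls by 10x per decade
    flux_per_package *= 20    # light per package rises by 20x per decade
    print(f"After {decade} decade(s): {cost_per_lumen:.3f} $/lm, {flux_per_package:.0f} lm/package")
```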
Koomey's law
Koomey's law states that the number of computations per joule of energy dissipated has been doubling approximately every 1.57 years.
This trend has been stable since the 1950s and has been faster than Moore's law. Jonathan Koomey reframed the trend as follows: "at a fixed computing load, the amount of battery you need will fall by a factor of two every year and a half".
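One quick way to see the implication is to compound the 1.57-year doubling period; the short sketch below computes the implied efficiency gain (the 10-year horizon is just an example).

```python
# Sketch of the growth implied by Koomey's law. The 1.57-year doubling period is
# taken from the text; the 10-year horizon below is an arbitrary example.
DOUBLING_PERIOD_YEARS = 1.57

def computations_per_joule_gain(years: float) -> float:
    """Factor by which computations per joule grow over `years`."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

print(computations_per_joule_gain(10))   # ~80x more computations per joule after a decade
```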
Carlson curve
The Carlson curve is the biotechnological equivalent of Moore's law and is named after author Rob Carlson. It predicted that the doubling time of DNA sequencing technologies (measured by cost and performance) would be at least as fast as Moore's law.
Carlson curves illustrate the rapid (in some cases hyperexponential) decreases in cost, and increases in performance, of a variety of technologies, including DNA sequencing, DNA synthesis, and a range of physical and computational tools used in protein expression and in determining protein structures.
Nielsen's Law
Nielsen's law states that the bandwidth available to a high-end user grows by about 50% per year.
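As a rough illustration, the snippet below compounds that 50% annual growth; the 100 Mbps starting point is an arbitrary assumption for the example.

```python
# Sketch of Nielsen's law: user bandwidth compounds at 50% per year.
bandwidth_mbps = 100.0      # assumed starting bandwidth, for illustration only
for year in range(1, 6):
    bandwidth_mbps *= 1.5   # +50% per year
    print(f"Year {year}: {bandwidth_mbps:.0f} Mbps")
```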
Swanson's law
Swanson's law states that the price of solar photovoltaic modules tends to drop 20 percent for every doubling of cumulative shipped volume. At present rates, costs halve about every 10 years.
It is named after Richard Swanson, the founder of SunPower Corporation, a solar panel manufacturer.
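Swanson's law behaves like an experience curve: the module price scales as 0.8 raised to the number of doublings of cumulative volume. The sketch below evaluates that relationship; the starting price and volume figures are assumptions for illustration.

```python
import math

# Sketch of Swanson's law as an experience curve: price falls 20% with every
# doubling of cumulative shipped volume. Starting price and volumes are assumed.
def module_price(initial_price: float, initial_volume: float, cumulative_volume: float) -> float:
    doublings = math.log2(cumulative_volume / initial_volume)
    return initial_price * 0.8 ** doublings

# After cumulative volume grows 8x (three doublings), price is 0.8^3, about 51% of the original.
print(module_price(1.0, 1.0, 8.0))   # ~0.512
```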
Keck's law
Keck's law states that the number of bits per second that can be sent down an optical fiber increases exponentially, faster than Moore's law.
Pollack's rule
Pollack's rule states that the microprocessor performance increase due to microarchitecture advances is roughly proportional to the square root of the increase in complexity. This contrasts with power consumption, which increases roughly linearly with complexity. Here, complexity refers to the processor logic, that is, its area.
The rule is an industry term and is named for Fred Pollack, a lead engineer and fellow at Intel.
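In other words, performance scales roughly as the square root of complexity: doubling the logic area buys only about a 1.4x speedup while power roughly doubles. The small sketch below simply evaluates that relationship.

```python
import math

# Sketch of Pollack's rule: performance scales with the square root of complexity
# (logic area), while power grows roughly linearly with it.
def pollack_speedup(complexity_ratio: float) -> float:
    return math.sqrt(complexity_ratio)

print(pollack_speedup(2.0))   # ~1.41x performance for 2x complexity (and roughly 2x power)
```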
Dennard Scaling
Dennard scaling states that, for transistors, power requirements are proportional to area (with both voltage and current being proportional to length).
Combined with Moore's law, performance per watt would grow at roughly the same rate as transistor density, doubling every 1–2 years.
According to Dennard scaling, transistor dimensions are scaled by 30% every technology generation, thus reducing their area by 50%. This reduces the delay by 30% and therefore increases operating frequency by about 40%. Finally, to keep electric field constant, voltage is reduced by 30%, reducing energy by 65% and power by 50%.
In every technology generation, transistor density doubles and the circuit becomes about 40% faster, while the overall power consumption (with twice the number of transistors) stays the same.
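The arithmetic behind these figures follows directly from a 0.7x linear shrink, as the short worked example below shows.

```python
# Worked arithmetic behind the figures quoted above: one technology generation,
# with linear dimensions scaled by k = 0.7 (a 30% shrink).
k = 0.7

area        = k * k        # 0.49 -> transistor area shrinks ~50%
delay       = k            # 0.70 -> gate delay falls ~30%
frequency   = 1 / delay    # 1.43 -> operating frequency rises ~40%
voltage     = k            # 0.70 -> voltage reduced 30% to keep the electric field constant
capacitance = k            # 0.70 -> capacitance scales with dimensions

energy = capacitance * voltage ** 2   # 0.343 -> switching energy falls ~65%
power  = energy * frequency           # 0.49  -> per-transistor power falls ~50%

# With twice as many transistors in the same chip area, total power stays roughly constant.
print(f"area={area:.2f} freq={frequency:.2f} energy={energy:.3f} power={power:.2f}")
```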
Wirth's law / Page's law / Gates's law / May's law / The great Moore's law compensator (TGMLC)
All of these laws are variants of the same observation about software efficiency.
Wirth's law states that software is getting slower more rapidly than hardware becomes faster. It is named after Niklaus Wirth, who discussed it in his 1995 paper, "A Plea for Lean Software".
Wirth's law is also referred to as Page's law, which is named after Larry Page, co-founder of Google.
Gates' law is the observation that the speed of commercial software generally slows by 50% every 18 months, thereby negating all the benefits of Moore's law. It is named after Bill Gates, co-founder of Microsoft.
May's law states that software efficiency halves every 18 months, compensating for Moore's law. It is named after David May, a British computer scientist.
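Taken together, these observations imply that hardware gains and software bloat roughly cancel out; the sketch below compounds both effects over a few 18-month periods using the rates quoted above to show the net result.

```python
# Sketch of "The Great Moore's Law Compensator": hardware speed doubles every
# 18 months (Moore's law) while software efficiency halves over the same period
# (Gates'/May's observation), so perceived performance stays roughly flat.
hardware_speed = 1.0
software_efficiency = 1.0

for period in range(1, 5):            # four 18-month periods (6 years)
    hardware_speed *= 2.0
    software_efficiency *= 0.5
    perceived = hardware_speed * software_efficiency
    print(f"After {period * 18} months: perceived performance = {perceived:.1f}x")
```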