Posted 22 Nov 2024 to the Science community on Beehaw
How Base 3 Computing Beats Binary

Highlights

Three, as Schoolhouse Rock! told children of the 1970s, is a magic number. Three little pigs; three beds, bowls and bears for Goldilocks; three Star Wars trilogies. You need at least three legs for a stool to stand on its own, and at least three points to define a triangle.

If a three-state system is so efficient, you might imagine that a four-state or five-state system would be even more so. But the more digits you require, the more space you’ll need. It turns out that ternary is the most economical of all possible integer bases for representing big numbers.

Surprisingly, if you allow a base to be any real number, and not just an integer, then the most efficient computational base is the irrational number e.
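To make the "radix economy" argument concrete, here is a minimal sketch (mine, not the article's) that scores each base b by (states per digit) × (digits needed), i.e. b · ⌈log_b N⌉, for a representative N:

```python
import math

# Radix economy: charge each base b a "hardware cost" of
# (states per digit) x (digits needed) = b * ceil(log_b(N)).
# Over the reals, b / ln(b) is minimized at b = e; among the
# integers, base 3 narrowly beats base 2.
N = 10**6
for b in (2, 3, 4, 5, 10):
    digits = math.ceil(math.log(N, b))
    print(f"base {b:2}: {digits:2} digits, cost {b * digits}")
# base  2: 20 digits, cost 40
# base  3: 13 digits, cost 39   <- cheapest integer base
# base  4: 10 digits, cost 40
# base  5:  9 digits, cost 45
# base 10:  6 digits, cost 60
```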

Despite its natural advantages, base 3 computing never took off, even though many mathematicians marveled at its efficiency. In 1840, an English printer, inventor, banker and self-taught mathematician named Thomas Fowler invented a ternary computing machine to calculate weighted values of taxes and interest. “After that, very little was done for years,” said Bertrand Cambou, an applied physicist at Northern Arizona University.

Why didn’t ternary computing catch on? The primary reason was convention. Even though Soviet scientists were building ternary devices, the rest of the world focused on developing hardware and software based on switching circuits — the foundation of binary computing. Binary was easier to implement.

1 comment

solanaceous · 4 points · 1 week ago

It really mostly doesn't, and Quanta Magazine is (as is typical for them) full of sh*t.

Ternary is most efficient if the space (power, etc.) needed to implement an operation on a base-b digit is proportional to b. (Then the cost is b * log(n) / log(b), and b/log b is minimized at e, but is lower at b=3 than at b=2.) However, in practice most operations take space that grows more than proportionally with b. For example, saturated transistors are either on or off, which is enough to implement binary logic, but ternary logic typically needs several more transistors per gate. Transistors, and especially CMOS-style implementations, are generally well suited to binary. If future computers use a different implementation style (neurons! who knows), then something other than binary logic might be best.
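To illustrate that point (my own sketch, using a hypothetical quadratic cost model rather than anything from the comment or article): if a base-b digit costs roughly b² instead of b, the continuous optimum of b²/ln b sits at √e ≈ 1.65, and binary wins among the integers:

```python
import math

# Hypothetical superlinear cost model: a base-b digit costs ~ b**2
# (e.g., extra transistors per extra state). Total cost is then
# b**2 * ceil(log_b(N)); minimizing b**2 / ln(b) gives b = sqrt(e),
# so base 2 is the best integer choice.
N = 10**6
for b in (2, 3, 4):
    digits = math.ceil(math.log(N, b))
    print(f"base {b}: cost {b**2 * digits}")
# base 2: cost 80
# base 3: cost 117
# base 4: cost 160
```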

Storing and transmitting data is different: that is often most efficient in bases other than 2. For example, if a flash cell of a given size can reliably hold 4 distinct amounts of charge, and the differences between them can reliably be read out, then flash manufacturers will store two bits per cell. This has been standard practice for years. It's most often done in bases that are powers of 2, but not always.
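For illustration, a minimal sketch of such a 4-level ("MLC") cell; the Gray-coded level mapping and the all-ones erased state follow common flash conventions, though real controllers are far more involved:

```python
# Sketch: one 4-level flash cell stores 2 bits. Adjacent charge
# levels differ in only one bit (Gray coding), so misreading a
# level by one corrupts at most one bit. Level 0 (erased) = "11",
# matching the flash convention that erasure sets all bits to 1.
LEVEL_TO_BITS = {0: "11", 1: "10", 2: "00", 3: "01"}

def read_cell(level: int) -> str:
    """Decode a sensed charge level back into two bits."""
    return LEVEL_TO_BITS[level]

for level in range(4):
    print(f"charge level {level} -> bits {read_cell(level)}")
```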

Ternary calculations are occasionally used in cryptography, but as far as I can tell, at least the first ternary crypto paper the article cites is garbage.

There are also other architectures, like clockless logic, that use a third value for "not done calculating yet", but that's different from ordinary ternary logic (and is generally implemented in binary anyway). Clockless logic showed a lot of promise for saving power, and some for reducing interference, but in most settings the added complexity and circuit size have been too much to actually deliver those savings.
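A minimal sketch of how that third value is typically carried on ordinary binary hardware, via dual-rail encoding (the scheme behind NULL-convention logic; the function name here is my own):

```python
# Dual-rail encoding: each logical signal uses two binary wires.
# (0, 0) means NULL ("not done calculating yet"); (1, 0) and (0, 1)
# mean logical 0 and 1. Note that it is all binary underneath.
NULL, ZERO, ONE = (0, 0), (1, 0), (0, 1)

def dual_rail_and(a: tuple, b: tuple) -> tuple:
    """AND gate that stays NULL until both inputs have arrived."""
    if a == NULL or b == NULL:
        return NULL  # output not valid yet; downstream logic waits
    return ONE if (a, b) == (ONE, ONE) else ZERO

print(dual_rail_and(ONE, NULL))  # (0, 0) -> still waiting
print(dual_rail_and(ONE, ONE))   # (0, 1) -> logical 1
print(dual_rail_and(ONE, ZERO))  # (1, 0) -> logical 0
```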