Balanced Ternary:
A Computational History
From signed-digit precursors and Fowler's wooden engine
to Setun, BitNet and ternary silicon
Opening
Balanced ternary is the positional number system in base three whose digits are not \(\{0,1,2\}\) but \(\{-1,0,+1\}\). Negation is performed by swapping the positive and negative symbols. Rounding and truncation coincide. There is no separate sign bit, because the leading non-zero trit already carries the sign of the whole number. Donald Knuth called it “perhaps the prettiest number system of all” [1].
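Those three properties — symmetric digits, negation by swapping, and truncation that coincides with rounding — are concrete enough to demonstrate in a few lines. The following Python sketch is purely illustrative; the helper names are my own and appear in none of the sources cited here.

```python
# Toy helpers for balanced ternary; illustrative only, not drawn from
# any source cited in this article.

def to_balanced_ternary(n):
    """Return the trits of integer n, least significant first."""
    trits = []
    while n != 0:
        r = n % 3            # Python's % is non-negative here: 0, 1 or 2
        if r == 2:           # a 2 becomes -1 plus a carry into the next place
            r = -1
        n = (n - r) // 3
        trits.append(r)
    return trits or [0]

def from_trits(trits):
    """Evaluate a least-significant-first trit list back to an integer."""
    return sum(t * 3**i for i, t in enumerate(trits))

def negate(trits):
    """Negation is a trit-wise swap of +1 and -1; no sign bit is involved."""
    return [-t for t in trits]

def drop_low_trit(n):
    """Truncating the lowest trit rounds n to the nearest multiple of 3."""
    trits = to_balanced_ternary(n)
    return 3 * from_trits(trits[1:])

assert from_trits(negate(to_balanced_ternary(19))) == -19
assert drop_low_trit(19) == 18 and drop_low_trit(20) == 21
```

The last assertion is the rounding-equals-truncation property in miniature: because the dropped trit is at most 1 in magnitude, discarding it always lands on the nearest multiple of three, with no separate rounding rule needed.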
That line is often quoted because it is neat and true, but it risks making balanced ternary sound like a beautiful sideshow. Its actual story is much rougher, more practical and far more human than that. It runs from old signed-digit tricks in arithmetic, through a Devonshire inventor building a wooden calculator in the 1830s, through Soviet ternary hardware in the Cold War, through decades of specialist multi-valued logic research, and then into the 2020s, where both AI researchers and hardware designers have started circling back to the same three-state alphabet for brutally modern reasons: memory, power, throughput and complexity.
This is not a timeline of isolated curiosities. It is a long, broken thread that keeps reappearing whenever engineers are forced back to first principles and asked the same awkward question again: what is the smallest, cleanest, most useful alphabet that still does the job?
Why balanced ternary keeps returning
The attraction of balanced ternary is easiest to feel in arithmetic before one ever gets to hardware. The digits are symmetric around zero. Positive and negative values are treated on the same footing. Many carry rules become tidier. Some forms of rounding become less awkward. And for any fixed numerical range, ternary uses fewer digits than binary. That does not automatically make it better in every machine, but it does make it stubbornly hard to forget.
There is also a deeper engineering instinct at work. Binary won because it matched the electronics of the twentieth century extraordinarily well. Two-state devices were easier to build, harder to misread and easier to scale. But binary's practical dominance never erased the old suspicion that three might be closer to the mathematical sweet spot. The best-known form of that argument is radix economy, where the hardware cost of a positional system is often sketched as roughly proportional to radix times word width. Under that simplification, the theoretical optimum lies near \(e\approx 2.718\), which makes 3 the nearest useful integer base [2][3].
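The radix-economy arithmetic is short enough to check directly. Under the simplification above, representing a range \(N\) needs about \(\log_r N\) digits, so cost scales as \(r/\ln r\) once the common factor \(\ln N\) is divided out; a quick sketch, with values rounded for readability:

```python
import math

# Radix-economy sketch: cost ~ radix x digit count ~ r * log_r(N),
# which is proportional to r / ln(r). The continuous minimum is at r = e.

def relative_cost(r):
    return r / math.log(r)

costs = {r: round(relative_cost(r), 3) for r in (2, 3, 4, 10)}
# {2: 2.885, 3: 2.731, 4: 2.885, 10: 4.343} -- base 3 narrowly beats base 2,
# and base 4 ties base 2 exactly (a base-4 digit is two binary digits).
```

The margin between 2.885 and 2.731 is small, which is part of why the argument persuades mathematicians more than it ever compelled manufacturers.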
That argument alone does not settle the matter either. Real engineering is never that charitable. But it helps explain why ternary keeps reappearing when designers hit limits. The pattern is remarkably consistent. When complexity hurts, when memory hurts, when energy hurts, when carry logic hurts, ternary gets another hearing.
Prehistory and disputed precursors
The earliest roots of balanced ternary are not ternary at all. They are signed-digit habits.
There is a long-running claim that the Vedic vinculum notation preserved the essential idea by allowing a digit to stand for subtraction from the next place value. In decimal, that lets a number be written as a combination of positive and negative digits. Whether that counts as a true ancestor of balanced ternary depends on how generous one wishes to be. It is better treated as a precursor of the signed-digit idea than as balanced ternary proper [4][5].
A similar caution applies to Johannes Kepler. He is sometimes credited with using signed-digit schemes in astronomical work, and Knuth mentions a Kepler connection in his historical notes, but the literature is not tidy enough to present Kepler as the first balanced-ternary thinker with a straight face [1]. The respectable claim is narrower. Signed-digit arithmetic is old. The permission to let a place be negative is older than most modern summaries imply.
In the European tradition, John Colson's 1726 “negativo-affirmative arithmetick” is a more defensible milestone, followed by John Leslie's Philosophy of Arithmetic in 1820, which treats arbitrary radices and signed digits in a notably systematic way [4][6]. Leslie matters because he turns the idea from a trick into a general method. Once that move is made, balanced ternary is no longer an eccentric afterthought. It becomes one instance of a broader arithmetic design space.
1840 and the question of priority
The year 1840 sits at the centre of the story because two different kinds of priority converge there.
In France, Augustin-Louis Cauchy discussed signed-digit representations in arbitrary bases, and Léon Lalanne followed with the paper Knuth later singled out as the first true appearance of “pure” balanced-ternary notation [1][7]. If the question is who first set balanced ternary down on paper in recognisable, explicit form, Lalanne has the cleanest claim.
In Devon, however, Thomas Fowler had gone somewhere else entirely. He had not started from formal notation. He had started from a clerk's life in money, measures and repetitive calculation, and from the practical headache of building a machine that did not turn into a mechanical nightmare. That led him, independently, to the same base.
The Lalanne versus Fowler question is therefore not really a duel unless one insists on flattening two different achievements into one. Lalanne has priority on notation. Fowler has priority on the working machine. The fact that these arrive almost simultaneously, and almost certainly independently, is one of the most striking things in the entire history of ternary computation.
Thomas Fowler and the wooden machine from Devon
Thomas Fowler was born in Great Torrington in 1777 and spent most of his life there. He was not the polished metropolitan inventor of textbook mythology. He was a cooper's son, apprenticed young, partially self-taught, later a printer, bookseller, clerk, accountant, inspector of weights and measures, councillor, inventor and, by every surviving account, a man of unusual stubbornness and range [8][9].
The local detail matters because it explains why he did not think like Babbage. Fowler worked among pounds, shillings, pence, weights, measures and poor-law disbursements. He lived with tables, conversions and tedious repeated arithmetic. He also had carpentry skills, which meant that when he designed a machine, he designed something he might actually build with his own hands. That practical background is exactly what turned balanced ternary from a mathematical possibility into a working engineering choice.
The usual simplified summary is that Fowler chose base 3 because it made the mechanics easier. That is true, but incomplete. He was also alive to the awkwardness of decimal in real calculation and to the usefulness of lower bases for divisible units. He was not trying to produce a universal philosophy of computation. He was trying to make a machine that solved actual work without turning into a small wooden catastrophe.
By 1829 he had already patented the thermosiphon, an invention now so ordinary in principle that it is easy to miss how revealing the episode is. Fowler had watched that patent experience turn bitter, expensive and, in his view, exploitative. That experience later shaped his disastrous refusal to patent or publish the crucial design details of his calculator [8]. The trauma of one invention haunted the fate of the next.
His calculating machine appears to have been operational by 1838 or thereabouts, and it was demonstrated in London in 1840 before a formidable audience including Charles Babbage, Augustus De Morgan, Francis Beaufort, Peter Roget, Sir John Lubbock and others [8][10][11]. De Morgan later wrote the Royal Society description. The machine itself was built in wood. No original example survives. A modern replica by Mark Glusker, based on surviving descriptions, is held in Great Torrington Museum [12][13].
What makes Fowler extraordinary is not merely that he used balanced ternary. It is that he used it because it worked. He discovered, by stubborn practical reasoning, that lower base meant simpler mechanics. That insight is not quaint. It is the same engineering instinct that reappears a century and a half later in AI inference and low-power silicon. Fowler would have recognised the logic immediately, even if the hardware would have looked like sorcery.
How Fowler lost the future
Fowler's machine worked, but working is not the same thing as surviving. He refused to patent the design properly or release the detailed drawings that scientific institutions wanted. From his point of view this was rational self-protection. From history's point of view it was fatal [8].
He had already seen what happened when an invention was exposed to the patent machinery of the day. He trusted neither the process nor the people around it. So, instead of locking his place in the historical record, he tried to protect the machine by keeping too much of it private. The result was the opposite of what he wanted. The design became famous enough to be admired, not famous enough to be reproduced, and then vulnerable enough to be forgotten. It is one of those painfully human turns in the history of technology: the instinct that made him careful also made him vanish.
The contrast with Charles Xavier Thomas de Colmar is instructive. Thomas de Colmar's arithmometer had a cleaner commercial afterlife because it entered the world of manufacture and sale. Fowler's entered the world of rumour, notes and missed chances. The machine survived as testimony, not as lineage. That is why his name slipped almost entirely out of public computing history for so long, despite later historians arguing that in some respects the machine sits closer to modern digital circuitry than Babbage's better-known engines [13][14].
The long quiet
After Fowler and Lalanne, balanced ternary more or less disappears from mainstream computing history for the better part of a century. It survives in mathematics, recreational treatments and the occasional historical footnote. This quiet matters because it creates the illusion that the idea was tested, rejected and finished with. In truth, it was more neglected than refuted.
That neglect ended just as electronic computing was beginning. Claude Shannon's 1950 paper, “A Symmetrical Notation for Numbers”, reintroduced symmetrical signed-digit systems into modern technical language [15]. At almost exactly the same moment, the 1950 ERA survey High-Speed Computing Devices gave the radix-economy argument for base 3 its most influential early hardware form, and Herbert Grosch was proposing ternary ideas in the orbit of MIT's Whirlwind [2][3].
This is the first time balanced ternary starts to look less like a historical curiosity and more like an unchosen road in mainstream computer architecture. Binary still won, but now one can see that there was a fork.
Setun and the Soviet ternary detour
The most serious answer to the question “what would a ternary computer actually look like?” arrived in Moscow.
Setun was developed in 1958 at Moscow State University under Sergei Sobolev and Nikolay Brusentsov, with Evgeny Zhogolev among the key contributors [16][17]. It used balanced ternary throughout. About fifty machines were built between 1959 and 1965 at the Kazan Mathematical Plant, and around thirty universities plus a number of research institutes and industrial sites used them in practice [16][17].
Setun matters because it was not an exhibition piece. It was a production machine with real users. Brusentsov's retrospective is still striking to read because the claimed advantages are not mystical. They are exactly the things ternary enthusiasts still point to now: simpler programming, less redundancy, cleaner handling of positive and negative values, reduced rounding awkwardness and a favourable parts count for the numerical range involved [16].
Then came Setun-70. By 1970 Brusentsov and Zhogolev had pushed the architecture further, introducing six-trit “trytes” and a design that, in retrospect, looks startlingly modern. Structured programming ideas were baked into the hardware. The instruction set was short and economical. Later commentators have repeatedly remarked that some of its instincts feel almost RISC-like before RISC was named [16][17].
And yet the line did not win. Production was halted, the political winds shifted, and binary standardisation rolled on. This is one of the recurring melancholy notes in the ternary story. Balanced ternary keeps producing machines that are clever, elegant and in some contexts highly practical, and then gets buried by the momentum of a larger ecosystem.
TERNAC, multi-valued logic and the years in the wilderness
In the early 1970s Gideon Frieder, Anatole Fong and C. Y. Chao built TERNAC at SUNY Buffalo, not as dedicated hardware but as a balanced-ternary machine emulated in software on a Burroughs B1700 [18][19]. This was important for two opposite reasons at once. It showed that ternary ideas were still intellectually alive, and it also showed that binary hardware was improving so quickly that clever ternary systems could be hosted on it instead of replacing it.
Frieder later remarked that the project became a victim of its own success. That line deserves to be remembered. Ternary arithmetic and logic were efficient enough in emulation that the case for building special-purpose silicon weakened just as VLSI was making binary machines faster and cheaper anyway [19]. It is a very modern sort of defeat: not being disproved, merely being outpaced.
From the 1970s through the 1990s, balanced ternary lived mostly inside the broader multi-valued logic research world: ternary adders and multipliers, Josephson-junction proposals, optical three-state schemes, specialist logic families and a long bibliography of “what if?” papers [20]. It never entirely died. It just moved off centre stage.
Emulators, enthusiasts and the preservation years
By the late 2000s balanced ternary had settled into a familiar pattern. Industry largely ignored it. Researchers revisited it in specialised pockets. Enthusiasts kept it alive.
The Tunguska balanced-ternary virtual machine, begun around 2008, is a good example. It gave experimenters a clean modern environment in which ternary ideas could be tried without waiting for hardware miracles [21]. Setun-70 emulators and documentation projects served a similar role. This was not the glamour phase of the story, but it mattered more than it looked. Without these preservation years, the current revival would feel like spontaneous novelty instead of what it really is, which is a return.
The modern renaissance
The reason balanced ternary feels suddenly current again in the 2020s is that two very different engineering cultures have reached for the same alphabet under pressure.
Silicon
In 2019, researchers at UNIST demonstrated wafer-scale unbalanced ternary semiconductor work. That was not balanced ternary, but it mattered because it reopened the question of three-state logic in serious semiconductor contexts [22].
Then came Huawei's patent application CN119652311A, filed in September 2023 and made public in 2025, explicitly describing ternary logic gate circuits using the balanced encoding \(\{-1,0,+1\}\) [23]. Whether the most breathless public performance claims attached to it will survive industrial reality is another question. The important fact is simpler: a major company has now put balanced ternary back on the table as a practical design direction for chips under modern economic and geopolitical pressure. Even if the first generation turns out to be imperfect, the taboo has plainly been broken.
AI models
At almost the same time, large language model research started heading towards the same place for different reasons. Microsoft's 2023 BitNet paper introduced transformer layers whose weights are constrained to a single bit [24]. The real conceptual jump came in 2024 with BitNet b1.58, where the weight alphabet becomes \(\{-1,0,+1\}\), that is, balanced ternary in all but name [25]. The “1.58” is simply \(\log_2 3\approx1.585\) bits per weight.
This is where the old Fowler intuition suddenly starts sounding uncannily modern. In BitNet, multiplication by a weight collapses to one of three actions: add, subtract, or do nothing. The arithmetic simplification is not decorative. It is the whole point. Less memory traffic. Less energy. Smaller models. Cheaper inference. In April 2025 Microsoft released the open-weight BitNet b1.58 2B4T model and the accompanying bitnet.cpp inference framework, making the approach concrete enough to test in the open rather than admire at a distance [26][27]. What had once been a beautiful number-system argument was now being measured in watts, tokens and memory footprint.
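The collapse of multiplication into three actions can be shown in a toy form. This is an illustration of the principle only: real kernels such as bitnet.cpp pack trits into dense formats and quantise activations, none of which is modelled here.

```python
# Toy illustration of the ternary-weight arithmetic collapse: with weights
# restricted to {-1, 0, +1}, a dot product needs no multiplier at all,
# only additions, subtractions and skips.

def ternary_dot(weights, activations):
    acc = 0.0
    for w, x in zip(weights, activations):
        if w == 1:
            acc += x          # weight +1: add the activation
        elif w == -1:
            acc -= x          # weight -1: subtract it
        # weight 0: skip entirely -- sparsity for free
    return acc

assert ternary_dot([1, -1, 0, 1], [2.0, 5.0, 7.0, 1.0]) == -2.0
```

Every zero weight is a memory access and an arithmetic operation that simply never happens, which is why the ternary alphabet, unlike a purely binary one, gives sparsity as a built-in side effect.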
Hardware returns
In 2026, Claudio Lorenzo La Rosa's 5500FP put another marker down: a 24-trit balanced-ternary RISC processor implemented on FPGA, with a real instruction set and a development-board story attached to it [28]. It is not a mass-market revolution, but it is exactly the sort of thing that changes the tone of a field. Ternary hardware is no longer merely historical, hypothetical or hidden in academic papers. It is once again something one can actually build and touch.
What the present moment actually means
It is tempting to tell this as a story of vindication. Balanced ternary was always right, the world was too slow to notice, and now at last history has caught up. That version is satisfying but not quite honest.
Binary won for very good reasons. It still wins in most of mainstream computing for very good reasons. The present moment does not prove that base 3 should have replaced base 2 everywhere. What it does show is that balanced ternary occupies a recurring optimum under certain pressures. When devices are cheap but energy is not, when parameters are huge but memory is not, when arithmetic overhead dominates, or when one wants signed values without a clumsy sign layer, balanced ternary ceases to be pretty and starts to be useful.
That is why the modern convergence is so interesting. Microsoft's BitNet work approaches the problem from AI inference and training efficiency. Huawei's patent approaches it from chip logic density and power. La Rosa approaches it from open architectural experimentation. These communities are not sharing a fashion. They are reacting to different bottlenecks and arriving at the same three-state compromise. When that happens in engineering, one pays attention.
A historical lesson and a roadmap
The value of this history is not merely antiquarian. It provides a roadmap.
First, it shows that balanced ternary is strongest when it is treated as an engineering choice rather than as a metaphysical slogan. Fowler chose it because lower base simplified the mechanics. Brusentsov chose it because it improved programming and machine economy in a specific architectural context. BitNet chooses it because it simplifies the arithmetic kernels of modern models. The pattern is always the same. It wins where it removes awkwardness from the real system in front of you.
Second, it shows how easily a good idea can disappear if it fails to secure a living implementation culture. Fowler lost the drawings. Setun lost the industrial war. Multi-valued logic spent decades in the margins. The present revival will matter only if it produces tools, models, boards, compilers and experiments that other people can actually use.
Third, it suggests that balanced ternary is not merely an alternative notation for existing binary thought. It often changes the shape of the problem. Once the alphabet is \(\{-1,0,+1\}\) instead of \(\{0,1\}\), subtraction is no longer an add-on, neutrality is explicit, and absence becomes first-class rather than encoded indirectly. That is exactly why it remains attractive in places where symmetry, cancellation, sparsity and signed structure matter.
The modern opportunity, then, is not to replay a Victorian argument in silicon cosplay. It is to ask, very practically, where balanced ternary is the right substrate for the job now. AI inference is one answer. Experimental hardware is another. Field-based simulation and lattice-native computation may be a third. History does not prove the future, but it does narrow the range of foolish mistakes.
Closing
The path from Thomas Fowler's workshop in Great Torrington to a native ternary language model running on a modern CPU is absurdly long and oddly coherent. Fowler discovered that lower base meant simpler mechanics. BitNet discovers that lower precision and a signed ternary alphabet mean simpler kernels. The centuries in between are full of false starts, lost machines, Cold War detours and specialist papers, but the central engineering instinct barely changes. Strip away the eras, the accents and the materials, and the argument keeps coming back in the same form.
If Knuth's old “flip-flap-flop” line is finally starting to look less like a joke and more like a delayed forecast, the delay has been extraordinary. Balanced ternary did not fail once. It kept returning whenever the constraints became honest enough. That is why its history matters now. It is not the history of a dead branch. It is the history of an idea that keeps being rediscovered by people who think they are solving today's problem and then find, to their mild annoyance, that someone in another century got there first.
Cited References
- [1] D. E. Knuth, The Art of Computer Programming, Volume 2: Seminumerical Algorithms, 2nd and 3rd editions, Addison-Wesley.
- [2] Engineering Research Associates, High-Speed Computing Devices, Office of Naval Research, 1950.
- [3] B. Hayes, “Third Base”, American Scientist, 2001.
- [4] J. Leslie, The Philosophy of Arithmetic, 1820.
- [5] “Main events in the history of balanced ternary number system”, ternary.3neko.ru.
- [6] J. Colson, “Negativo-Affirmative Arithmetick”, 1726.
- [7] L. Lalanne, Comptes Rendus, vol. 11, 1840, pp. 903–905.
- [8] C. White, “How to Build a Wooden Computer: The extraordinary life and times of Thomas Fowler of Great Torrington, Devon, UK”.
- [9] P. Vass, M. Glusker and D. Hogan, The Power of Three: Thomas Fowler, Devon's Forgotten Genius, Boundstone Books, 2022.
- [10] A. De Morgan, “Description of a calculating machine invented by Mr. Thomas Fowler, of Torrington in Devonshire”, Abstracts of the Papers Printed in the Philosophical Transactions of the Royal Society of London, 1843.
- [11] H. Fowler, “Biographical Notice of the late Mr Thomas Fowler of Torrington with some account of his inventions”, 1875.
- [12] M. Glusker, “The Ternary Calculating Machine of Thomas Fowler”, 2005.
- [13] D. Swade, The Difference Engine: Charles Babbage and the Quest to Build the First Computer, 2001.
- [14] P. Vass, M. Glusker and D. Hogan, “The ternary calculating machine of Thomas Fowler”, IEEE Annals of the History of Computing, 2005.
- [15] C. E. Shannon, “A Symmetrical Notation for Numbers”, American Mathematical Monthly, 57(2), 1950.
- [16] N. P. Brusentsov and J. Ramil Alvarez, “Ternary Computers: The Setun and the Setun 70”, in Perspectives on Soviet and Russian Computing, IFIP AICT 357, Springer, 2011.
- [17] N. P. Brusentsov, “Development of ternary computers at Moscow State University”, Russian Virtual Computer Museum.
- [18] G. Frieder, A. Fong and C. Y. Chao, “A Balanced Ternary Computer”, SUNY Buffalo, 1972.
- [19] G. Frieder and C. Luk, “Ternary computers: part 2: emulation of a ternary computer”, MICRO 5, 1972.
- [20] General multi-valued logic surveys and ternary-computer bibliographies from the 1970s to the 1990s, including Josephson-junction and optical ternary proposals.
- [21] Tunguska balanced-ternary virtual machine project documentation.
- [22] K.-r. Kim and collaborators, wafer-scale ternary semiconductor work at UNIST, 2019.
- [23] Huawei Technologies Co. Ltd., “Ternary logic gate circuit, computing circuit, chip and electronic device”, patent application CN119652311A, filed 2023, published 2025.
- [24] H. Wang and collaborators, “BitNet: Scaling 1-bit Transformers for Large Language Models”, 2023.
- [25] S. Ma, H. Wang and collaborators, “The Era of 1-bit LLMs: All Large Language Models are in 1.58 Bits”, 2024.
- [26] Microsoft, bitnet-b1.58-2B-4T model release, 2025.
- [27] Microsoft, bitnet.cpp inference framework, 2025.
- [28] C. L. La Rosa, “5500FP: A 24-Trit Balanced Ternary RISC Processor”, 2026.