2

I was wondering how computers would look if ternary (base-3) logic were used. It seems like the bigger the base, the more values the same amount of memory can represent. I'll explain: a binary address 32 digits long lets you represent 2^32 possible values, while a ternary address of the same length gives you 3^32, which is roughly 431,439 times bigger than the binary one.
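To double-check that arithmetic, here is a quick Python sketch (the only assumption is the 32-digit length used above):

    # 32 binary digits vs. 32 ternary digits
    binary_values = 2 ** 32    # 4294967296
    ternary_values = 3 ** 32   # 1853020188851841
    print(ternary_values / binary_values)  # ~431439.9, i.e. roughly 431,439 times as many values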

That seems much better. The hardware side also looks simple enough: 2 means strong current, 1 means weak current, and 0 means no current. Of course it is much more complicated in practice, but the idea is simple. However, I couldn't find any reference to recent research or to any new computer using this kind of logic.

So my question is: why aren't we using three-valued logic, or any n-valued logic (n > 2)? What is stopping us from doing that?

RanZilber
  • Think of the code rewrite. That would be insane! – beatgammit Aug 07 '11 at 12:35
  • @tjameson - I don't think you'll need it. Backward compatibility could be implemented easily, since every word that can be written in binary can also be written in ternary. – RanZilber Aug 07 '11 at 12:37
  • Yeah. You could also use no current, very very weak current, very weak current, moderate current, etc., and make a decimal computer. That would really be handy, although the Babylonians would prefer a base-60 computer, which is of course insane. – GolezTrol Aug 07 '11 at 12:38
  • Well, some code depends on integer overflow, and if integers are bigger, then they'll take longer to overflow. Compilers would have to be rewritten to take advantage of the new definition and you'd get weird errors. – beatgammit Aug 07 '11 at 12:39
  • Say you have three values, what do you have for booleans: true, false, maybe? – beatgammit Aug 07 '11 at 12:40
  • @tjameson - True, I hadn't thought about that. But it can be solved either at the compiler level or in the same way 32-bit programs run on 64-bit computers: the program thinks it is using 32-bit memory while it isn't. The same could be done here, either at the compiler level or lower. – RanZilber Aug 07 '11 at 12:42
  • True, False and null. Of course they wouldn't be booleans, but trillians. – GolezTrol Aug 07 '11 at 12:45
  • I don't know why this is not constructive, but it is a dupe: http://stackoverflow.com/q/764439/240633 – ergosys Mar 17 '12 at 04:55
  • Using this logic we should go back to analog computers... "infinite"-bit computing! – DarthRubik Oct 06 '16 at 12:35

3 Answers

9

These already exist. In fact, one of the early computers (the Soviet Setun, built in the late 1950s) used ternary logic, and Knuth believes that, due to its efficiency and elegance, we will eventually move back to using it.

Wolfwyrd
4

I'm surprised you didn't find anything on this in computer architecture/digital logic books! It is possible to do ternary, or generally multi-valued, logic on chips; the question is not so much about the logic as about the electrical thresholds.

An on/off (1/0) signal is not purely off when it's a 0; it's defined by a threshold, i.e., anything below a certain voltage level is considered off and anything above it is considered on. Now you come along and say let's go ternary, and the transistors start feeling the pressure: the circuitry now needs multiple thresholds to get you what you want, and it must be tuned much more precisely so that those thresholds/boundaries are reliably obeyed.
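To make the threshold point concrete, here is a rough Python sketch; the voltage range and cutoff values are made up purely for illustration, not taken from any real chip:

    # Binary: one cutoff splits the range into two bands (off / on).
    def read_bit(voltage):
        return 1 if voltage >= 0.5 else 0

    # Ternary: two cutoffs split the same range into three bands,
    # so each band is narrower and the noise margins are tighter.
    def read_trit(voltage):
        if voltage < 0.33:
            return 0   # no current
        elif voltage < 0.66:
            return 1   # weak current
        else:
            return 2   # strong current

    for v in (0.1, 0.4, 0.6, 0.9):
        print(v, read_bit(v), read_trit(v))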

Assuming you get the thresholds out of the way, you still have the problem of the human mind :) Which do you like better:

1100110011 or 1122110022

I prefer the former, but maybe that's just me. Ternary logic systems DO exist! In fact quantum computing takes it a leap further with multiple states!!
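Coming back to the readability point, here is a small Python sketch (to_base is just an illustrative helper, not a standard function) that prints the same number in base 2 and in base 3; the binary form happens to be the first string shown above:

    def to_base(n, base):
        """Return a non-negative integer n as a digit string in the given base."""
        if n == 0:
            return "0"
        digits = []
        while n:
            n, r = divmod(n, base)
            digits.append(str(r))
        return "".join(reversed(digits))

    n = 819
    print(to_base(n, 2))  # 1100110011  (10 binary digits, the string above)
    print(to_base(n, 3))  # 1010100     (7 ternary digits)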

The thing is, you CAN do it; the question is, is it worth it? Going by the evidence, binary dominates and definitely seems worth sticking with!

PhD
  • But quantum computers can have multiple states at once, as long as you don't actually try to read them. – GolezTrol Aug 07 '11 at 12:47
0

At their base, computers use switches, which have two states: on and off. When dealing with electric current, at the most basic level, those are your two options. While in theory you probably could have different amounts of current count as different digit values, it would be complicated.

The book Code, by Charles Petzold, explains how computers work from the ground up, all the way through building a basic processor. I think you'll gain a lot from giving it a read.

Moshe