
Minimalist CPUs - one instruction!

  • 24-03-2005 7:12pm
    #1
    Moderators, Recreation & Hobbies Moderators, Science, Health & Environment Moderators, Technology & Internet Moderators Posts: 91,687 Mod ✭✭✭✭


    I was looking for a CPU with about 16 instructions and found this

    http://www.safalra.com/programming/misc/about.html
    MISC is RISC taken to the extreme, with only one instruction - 'subtract and branch if negative'. It has the minimal number of instructions logically possible - one. The instruction is 'subtract and branch if negative', and it can be shown that a MISC can perform any calculation a normal RISC (Reduced Instruction Set Computing) computer or CISC (Complex Instruction Set Computing) computer could do.

    Having only one instruction makes it easy to verify mathematically, so it should be safe in use - not much chance of a Pentium division error.
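    The single instruction above can be sketched as a tiny interpreter - this is my own illustration, and the encoding (three memory cells per instruction, halt on a negative branch target) is an assumption, not from the article:

    ```python
    def subneg(mem, pc=0):
        # One instruction: mem[b] -= mem[a]; branch to c if the result
        # is negative, otherwise fall through. A negative target halts.
        while pc >= 0:
            a, b, c = mem[pc], mem[pc + 1], mem[pc + 2]
            mem[b] -= mem[a]
            pc = c if mem[b] < 0 else pc + 3
        return mem

    # Addition synthesised from subtraction alone: B += A.
    # Cells 9..11 hold the data A=7, B=5 and a scratch zero Z.
    A, B, Z = 9, 10, 11
    prog = [
        A, Z, 3,    # Z -= A  -> Z = -7 (negative, branch to next instr anyway)
        Z, B, 6,    # B -= Z  -> B = 5 - (-7) = 12
        B, Z, -1,   # Z -= B  -> negative, so branch to -1: halt
        7, 5, 0,    # data cells: A, B, Z
    ]
    mem = subneg(prog)
    print(mem[B])  # 12
    ```

    Even a single addition takes three instructions and a scratch cell, which gives a feel for why compiling to such a machine would be interesting.
    
    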

    Would make designing a compiler interesting, to say the least.

    Does anyone know of a CPU or language with a minimal set of instructions?


Comments

  • Registered Users Posts: 4,003 ✭✭✭rsynnott


    Turing machine :)
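    And a Turing machine itself fits in a few lines - a toy sketch (my own encoding: a transition table keyed by (state, symbol)), here incrementing a binary number written least-significant-bit first:

    ```python
    def run_tm(tape, rules, state="carry", blank=0):
        # tape: list of symbols; rules: (state, symbol) -> (write, move, next state)
        tape = dict(enumerate(tape))
        pos = 0
        while state != "halt":
            sym = tape.get(pos, blank)
            write, move, state = rules[(state, sym)]
            tape[pos] = write
            pos += move
        return [tape[i] for i in sorted(tape)]

    # Increment LSB-first binary: flip trailing 1s to 0 until a 0 is found.
    rules = {
        ("carry", 1): (0, +1, "carry"),  # 1 + carry -> 0, keep carrying
        ("carry", 0): (1, 0, "halt"),    # 0 + carry -> 1, done
    }
    print(run_tm([1, 1, 0, 1], rules))  # [0, 0, 1, 1]  (11 -> 12, LSB first)
    ```
    
    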


  • Closed Accounts Posts: 17,208 ✭✭✭✭aidan_walsh


    Brain*uck. Only eight instructions, and a very appropriate name.
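    Those eight instructions fit in a very small interpreter - a quick Python sketch (the 30,000-cell tape and the NUL-on-EOF input behaviour are arbitrary choices on my part):

    ```python
    def bf(code, data_in=""):
        # The eight instructions: > < + - . , [ ]
        tape, ptr, out, inp = [0] * 30000, 0, [], iter(data_in)
        jumps, stack = {}, []
        for i, c in enumerate(code):          # pre-match the brackets
            if c == "[":
                stack.append(i)
            elif c == "]":
                jumps[i] = stack.pop()
                jumps[jumps[i]] = i
        pc = 0
        while pc < len(code):
            c = code[pc]
            if c == ">": ptr += 1
            elif c == "<": ptr -= 1
            elif c == "+": tape[ptr] = (tape[ptr] + 1) % 256
            elif c == "-": tape[ptr] = (tape[ptr] - 1) % 256
            elif c == ".": out.append(chr(tape[ptr]))
            elif c == ",": tape[ptr] = ord(next(inp, "\0"))
            elif c == "[" and tape[ptr] == 0: pc = jumps[pc]
            elif c == "]" and tape[ptr] != 0: pc = jumps[pc]
            pc += 1
        return "".join(out)

    print(bf("+" * 65 + "."))  # A
    ```
    
    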


  • Registered Users Posts: 4,003 ✭✭✭rsynnott


    Brain*uck. Only eight instructions, and a very appropriate name.

    Ah, never actually bothered to read the spec before. It isn't really all that bad! Beats Intercal any day...


  • Closed Accounts Posts: 1,567 ✭✭✭Martyr


    I remember you posted something about viruses that would run on embedded hardware chips.
    Looking into it, it's not far-fetched.

    This is from microchip.com
    SELF PROGRAMMING

    Microchip's PIC16F87X family features self programming capability. Self programming enables remote upgrades to the Flash program memory and the end equipment through a variety of medium ranging from Internet and modem to RF and infrared. To setup for self programming, the designer programs a simple boot loader algorithm in a code-protected area of the Flash program memory. Through the selected medium, a secure command allows entry into the PIC16F87X microcontroller through the USART, I²C™ or SPI™ serial communication ports. The boot loader is then enabled to reprogram the PIC16F87X Flash program memory with data received over the desired medium. Self programming is accomplished without the need for external components and without limitations on the PIC16F87X's operating speed or voltage.

    How practical it is in a network environment is open to debate.
    I don't know enough about these chips, but I read that the 18F has about 72 instructions whereas the 16F has roughly 30.

    Usually these chips are for portable devices, but maybe they are used in peripheral hardware too.

    Does anyone know what kind of micro controllers most corporations use in mobile devices?


  • Registered Users Posts: 1,391 ✭✭✭fatherdougalmag


    Having only one instruction makes it easy to verify mathematically
    As long as you're only proving that one instruction :)

    As soon as you start building complex operations such as addition, things aren't so rosy.

    The best you could do is something that does OR, AND and XOR operations. Thereafter anything's possible. But never assume simplicity. Sometimes even simplicity needs to be discovered and isn't given.
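    To illustrate the point: a one-bit full adder needs nothing beyond those three operations, and chaining eight of them gives an 8-bit adder (a Python sketch of my own, not tied to any particular chip):

    ```python
    def full_adder(a, b, cin):
        # Sum and carry-out built purely from XOR, AND and OR.
        s = (a ^ b) ^ cin
        cout = (a & b) | (cin & (a ^ b))
        return s, cout

    def add8(x, y):
        # Ripple-carry: chain eight one-bit full adders.
        carry, out = 0, 0
        for i in range(8):
            s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
            out |= s << i
        return out

    print(add8(200, 100))  # 44, i.e. (200 + 100) mod 256
    ```
    
    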


  • Moderators, Recreation & Hobbies Moderators, Science, Health & Environment Moderators, Technology & Internet Moderators Posts: 91,687 Mod ✭✭✭✭Capt'n Midnight


    As long as you're only proving that one instruction :)
    If you get really lucky you might be able to prove it by induction or exhaustion. All you want to be able to prove is that the CPU produces the correct output for every combination of inputs. This isn't possible in today's PC CPUs because of the number of instructions - that's why the Pentium bug took so long to discover, and why Airbuses used 80186s instead of faster 286/386/486 etc.
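    As a toy example of proof by exhaustion (my own illustration, nothing to do with the actual Pentium FDIV case): a gate-level 8-bit subtractor is small enough to check against ordinary arithmetic for all 65,536 operand pairs:

    ```python
    def ripple_sub(b, a):
        # b - a on 8 bits, computed as b + ~a + 1 with a ripple-carry adder.
        carry, result = 1, 0            # initial carry-in of 1 supplies the "+1"
        na = (~a) & 0xFF                # one's complement of a
        for i in range(8):
            x, y = (b >> i) & 1, (na >> i) & 1
            s = x ^ y ^ carry                    # sum bit
            carry = (x & y) | (carry & (x ^ y))  # carry out
            result |= s << i
        return result

    # Exhaustion: every combination of 8-bit inputs vs the arithmetic reference.
    print(all(ripple_sub(b, a) == (b - a) % 256
              for a in range(256) for b in range(256)))  # True
    ```

    A real CPU's input space is astronomically larger, which is the whole point of the post above.
    
    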

    You can also get mathematically provable computer languages, but AFAIK no major project has been done with one. This would sound like the ideal chip to run one on.

