#21
(05-25-2012, 10:33 AM)Alexander Moore Wrote: So, ultimately, what language do all machines understand? Just binary or something?

Binary isn't a language per se; it's a representation of a language. But yes, this is what all computers understand. The problem is that no two platforms share the same binary: x86 binary is completely different from MIPS binary. That's why we have emulators; they (essentially) convert the MIPS binary into x86 binary, adding whatever extra work is needed for things the x86 processor doesn't support directly.
[Image: ref_sig_anim.gif]
Like our Facebook Page and visit our Facebook Group!


#23
The stuff sits in computer memory of course Wink
#24
Binary is just a list of on/off controls for transistors: it tells the chip which transistors to put voltage through, and the combined effect of billions of those switches is what you see happening. I don't know EXACTLY how it works, but computers are basically huge powerhouses full of electrical "gates" that, in the right combinations, end up making pretty pictures on the screen xD
#26
Think of binary not as a language that all machines understand, but more like the alphabet. The 1s and 0s stay the same, but the meaning of structures formed by those 1s and 0s is different across different architectures, just like the same group of letters can mean very different things across different human languages.

As for what's actually physically happening, that's a very complex subject really. Your teacher was quite right, you likely won't understand this sufficiently without actually studying it. I know I don't, certainly not enough to try and explain it even roughly in very basic terms myself. Wink
#28
Nobody codes in binary (no sane person, anyway Tongue2). There is, however, an in-between step between binary code and common high-level languages, called assembly. Assembly languages are very specific to their respective architectures, and programs written in assembly are not portable between them, but they allow for a higher degree of optimization and generally result in better-performing programs.
A good example back in the day was zsnes vs. snes9x: zsnes was mostly written in assembly and easily outperformed snes9x. That doesn't really matter anymore these days, but I remember my machine back then sometimes choking with snes9x. Good times Smile

As for Java: Java is popular because it's very easy to write cross-platform applications with it. This is because the code is compiled not for the actual target platform, but into bytecode intended to be interpreted by the Java Virtual Machine (which also does various other things). This in turn means that your program will run on any OS and architecture the JVM runs on, without you having to worry about targeting each platform specifically.
It's also a very high-level language, so you don't have to worry as much about lower-level workings (proper memory management, for example).

Of course, all of that adds up to a significant amount of overhead, and Java is generally quite slow. As a general rule, lower-level languages allow for better performance and a higher degree of optimization, but this also increases coding complexity, leading to more potential errors and bugs and increased development time. Time = money, and you know the rest Wink
I doubt it will die out anytime soon, despite its various shortcomings.
#30
"Good times" indeed. I programmed one or two minor subroutines by directly POKEing byte values into specific memory locations... under the native BASIC of the ancient Sinclair ZX80.

The language being used was the ZX80's built-in BASIC; POKE was one of its commands, and it writes a byte value to a specific memory address, so you could fill memory with the desired values one address at a time. Those bytes could be plain data, or they could form executable machine code when that location was later called from BASIC and given control.

So a series of binary digits can, in that sense, be seen as a language, but one needs to be careful: what is usually meant is the written string, as in the random example "010010111001". That is what's loosely referred to as "binary language", and it should by no means be confused with the actual bits sitting in the computer's memory, waiting to be read, written or executed. Notice, too, that the values given to the POKE command were written as hex, not binary.

As pointed out in a former post, the digits 0 and 1 are the whole "alphabet" the computer actually understands. So whatever language is used to write the code, every single instruction must be transformed into sequences of those digits, in the specific arrangement that the machine the code was compiled for understands as a specific instruction performing a specific action. This is totally dependent on the processor architecture, so portability is null. To make things worse, although binary sequences can be generated and entered into the computer directly... they are VERY hard for humans to understand... so...

Someone thought to associate the processor's basic operations with mnemonics that were easier to understand, but still tied one-to-one to the processor's opcode table. And thus assembly was born, as (in effect) the first programming language for computers.

Although a lot more efficient (from the human point of view) than directly poking bytes into the computer's memory, assembly was still hard to program in and kept the main issue of not being portable. We humans needed something friendlier... a way to "say" things to the computer more naturally, letting a compiler take those simple commands and transform them into what is sometimes a huge number of specific instructions.

This new level of language became more independent of the machine architecture, and the same code could feed compilers for different machines with minimal changes or none at all. Many of these languages first translated their code to assembly, and from there the assembler turned it into binary digits, poked into the machine in the last instance just as in the beginning. For this reason, and for abstracting away the machine architecture, these came to be known as high-level languages.

Oops, sorry, the post is already bigger than I hoped to make it (and off topic, at that)... see ya Smile

PS: As a bit of curiosity, notice how I use "Assembly" for the language and "assembler" for the actual program that does... well... the assembling of the code into binary digits.
Trace the parallel to the difference between, say, the C++ language and the C++ compiler.
Imagination is where we are truly real



