"Good times"... I once programmed one or two minor subroutines by directly "poke"-ing binary values into specific memory locations, under the native BASIC of the ancient Sinclair ZX80.
The language in question was the ZX80's built-in "Basic", and POKE was one of its commands: it let you write values into specific memory addresses until they held whatever you wanted. Those bytes could be plain storage data, or they could "form" executable code once that location was called from BASIC and given control.
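Just as an illustration (not real ZX80 BASIC, of course, but a Python sketch of the same idea), poking values into memory can be pictured as writing bytes into an array:

```python
# Toy model of a small memory bank; the real ZX80 shipped with 1 KB of RAM.
memory = bytearray(1024)

def poke(address, value):
    """Write one byte (0-255) at the given address, like BASIC's POKE."""
    memory[address] = value & 0xFF

def peek(address):
    """Read the byte back, like BASIC's PEEK."""
    return memory[address]

# "Poke" a couple of values into consecutive addresses.
poke(100, 0x3E)   # these bytes could be plain data...
poke(101, 0x05)   # ...or, on a real ZX80, the Z80 opcodes for "LD A, 5"

print(peek(100), peek(101))  # -> 62 5
```

Whether those two bytes are "data" or "code" depends entirely on whether the processor is ever told to jump there and execute them.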
So a series of binary digits can, under that concept, be seen as a language, but one needs to be careful about what is meant: the "string", as in this random example "010010111001". Such a string is loosely referred to as "binary language", but it should by no means be confused with the actual bits in the computer's memory waiting to be read, written or executed. Notice, too, that in practice the values fed to the POKE command were written in hex, not as binary strings.
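To make that string-versus-actual-bits distinction concrete, here is a small Python sketch: the same value can be written as a binary string, shown as hex, or held as real bytes in memory.

```python
bits = "010010111001"   # a binary *string*: just 12 characters of text
value = int(bits, 2)    # the number those digits denote

print(value)       # -> 1209
print(hex(value))  # -> 0x4b9

# The string takes 12 characters of text; the value itself fits in 2 bytes.
print(len(bits), value.to_bytes(2, "big"))
```

The string is a human-readable description; the two bytes at the end are what would actually sit in memory.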
As pointed out in a former post, the digits 0 and 1 are the whole "alphabet" the computer actually understands. So whatever language is used to write the code, each and every instruction must be transformed into sequences of those digits, in the specific arrangement that the machine the code was compiled for recognizes as an instruction to perform a specific action. This is totally dependent on the processor architecture, so portability is null. To make things worse, although it is possible to generate binary sequences by hand and feed them directly into the computer... they are VERY hard for humans to understand... so...
Someone thought to associate the processor's basic functions with mnemonics that were easier to understand, but still tied one-by-one to the processor's opcode table. And thus Assembly was born as (in effect) the first language for programming computers.
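That one-to-one mapping is really just a lookup table. Here is a toy Python sketch of the idea, using a few real Z80 opcodes (the ZX80's CPU was a Z80); a real assembler also handles operands, labels and addresses, but the core mapping looks like this:

```python
# A minimal mnemonic -> opcode table for a few real Z80 instructions.
OPCODES = {
    "NOP": 0x00,   # do nothing
    "HALT": 0x76,  # stop the CPU
    "RET": 0xC9,   # return from subroutine
}

def assemble(lines):
    """Translate mnemonics, one by one, into the bytes the processor executes."""
    return bytes(OPCODES[line.strip().upper()] for line in lines)

code = assemble(["NOP", "RET"])
print(code.hex())  # -> 00c9
```

The resulting bytes are exactly what one would have POKE-d into memory by hand, which is why Assembly saved so much human effort without changing what the machine actually receives.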
Although a lot more efficient (from the human point of view) than directly "poking" bytes into the computer's memory, Assembly was still hard to program in and kept the main issue of not being portable. We humans needed something friendlier: a way to "say" things to the computer in a more natural form, and let a compiling program take those simple commands and transform them into what was sometimes a huge number of specific instructions.
This new level of language became more independent of the machine architecture, and the same code could feed compilers for different machines with minimal changes, or none at all. Many of these languages first translated their code into Assembly, and from there the assembler turned it into the binary digits that were, in the last instance, poked into the machine just as in the beginning. For this reason, and for abstracting away the machine architecture, these languages came to be known as high-level languages.
Oops, sorry, this post is already bigger than I hoped to make it (not to mention off topic)... see ya
PS: As a point of curiosity, notice how I use "Assembly" for the language and "assembler" for the actual program that does... well... the assembling of the code into binary digits.
Trace the parallel to the difference between, let's say, the C++ language and the C++ compiler.