How Does MIDI Design Work?

Now, dealing with the low nibble of the first byte (the MIDI channel) is pretty simple: we initialize the chip beforehand to care about only one specific channel. If a message is addressed to any other channel, it gets ignored and we wait for the next message to come in.
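As a rough C sketch of that channel check (the name `MY_CHANNEL` and the exact value are my own assumptions, not from the thread): the low nibble of a status byte is masked off and compared against the one channel the chip was initialized to care about.

```c
#include <stdint.h>

#define MY_CHANNEL 0x03  /* assumption: the one channel this device listens on (0-15) */

/* Return 1 if this byte is a status byte addressed to our channel, 0 otherwise. */
int message_for_us(uint8_t status)
{
    /* MIDI status bytes have the most significant bit set;
     * the low nibble carries the channel number. */
    if ((status & 0x80) == 0)
        return 0;                      /* a data byte, not a status byte */
    return (status & 0x0F) == MY_CHANNEL;
}
```

With `MY_CHANNEL` set to 3, a note-on status byte `0x93` would pass the check, while `0x94` (channel 4) would be ignored.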

I get that part; it's the microprocessor's source code for ignoring the other low-nibble values that is confusing me, as I have never come across that type of code when working with microcontrollers, nor have I found any information about it in my books (that I know of).
 
I don't think so. Interrupts are used to stop the current routine or subroutine so that an interrupt routine can take place; they are used to free up ALU resources and keep routines from having to "poll". From my understanding, with 8 bits being a byte, groups of usually 2 or 3 bytes form "words" separated by spaces. In machine language, the space or separation marking a new word or new piece of information is one universal binary value signifying that the next byte is the first byte of a new word.

If you were programming in C, to get the channel selectivity you would use "if" and "else" statements: the "if" branch would hold the code stating that if <P1.0> = B#h (# being the channel you want to work on and P1.0 being the port you are receiving from), then move on to the next byte; the "else" branch would test whether <P1.0> equals the binary value for the break and, if so, return to the top of the subroutine so the code can go back to the original if statement and process the next word.
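The if/else selectivity described above could be sketched in C roughly as below. One caveat on the framing: in actual MIDI there is no spacing value between words; the start of a new message is marked by the status byte's high bit being set, so the sketch keys off that instead. The names `MY_CHANNEL` and `count_our_note_ons` are hypothetical, and the sketch ignores running status for clarity.

```c
#include <stdint.h>
#include <stddef.h>

#define MY_CHANNEL 0x02  /* assumption: the channel this device responds to */

/* Scan a buffer of incoming MIDI bytes and count note-on messages
 * (status 0x9n) addressed to MY_CHANNEL. Messages for other channels,
 * and their data bytes, are skipped over. */
size_t count_our_note_ons(const uint8_t *buf, size_t len)
{
    size_t count = 0, i = 0;
    while (i < len) {
        uint8_t b = buf[i];
        if ((b & 0xF0) == 0x90 &&          /* note-on status byte */
            (b & 0x0F) == MY_CHANNEL &&    /* low nibble matches our channel */
            i + 2 < len) {                 /* note and velocity bytes present */
            count++;
            i += 3;                        /* consume status + 2 data bytes */
        } else {
            i++;  /* wrong channel or not a note-on: skip and keep scanning */
        }
    }
    return count;
}
```

Scanning the stream `{0x92, 60, 100, 0x93, 60, 100, 0x92, 64, 90}` with `MY_CHANNEL` = 2 would count the two channel-2 note-ons and skip the channel-3 one.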