> Somewhat over the top comment. ... Still, calling them the bedrock and standard is too much.
Nope, I meant exactly what I said. The point is not popularity/commodity status but understanding a domain from its foundations.
I am one of the oldies who started with (in chronological order) MS-DOS, a brief foray into mainframe COBOL, 16-bit Windows (Windows 3.1), 32-bit Windows (Windows NT 3.51 & 4), and then Solaris/Linux, all using C/C++. So I had actually programmed Windows before Unix.

Back then there was only Dave Cutler's book on the Windows NT kernel, with everything else covering only programming the GUI subsystem (Charles Petzold and Jeffrey Richter were the notable authors), whereas in the Unix world you had Maurice Bach/Marshall McKusick/etc. books explaining the kernel and Marc Rochkind/Richard Stevens/etc. explaining how to program it.

I still remember reading the 1st edition of Rochkind's book (there were only around a dozen system calls then and the book was less than 200 pages, I think) and understanding everything (Unix was a simple monolith then), while Windows was made up of a kernel plus various subsystems and it was rather hard to understand what was what. And through it all, C was the common "glue" lingua franca which allowed one to program all of them.
The above situation persists to this day in spite of the explosion of languages/OSes/architectures. A knowledge of C/Unix will allow you to understand and program any system, be it bare-metal, kernel-level, app-level, or system utilities. You may not need it at your job, but a working knowledge of C/Unix (you don't have to become an expert) gives you a solid bedrock to hang your higher-level understanding on. It will also make your transitions to other languages/OSes easier, since you will have a good understanding of what is going on underneath.
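To make that concrete, here is a minimal sketch (my own illustration, not taken from any particular book) of the style of programming Rochkind's and Stevens' books teach: talking to the Unix kernel directly through a handful of system calls. It copies a file using only open/read/write/close with errno-based error handling; the buffer size and permission bits are arbitrary choices.

    /* Copy a file using only Unix system calls, no stdio buffering. */
    #include <fcntl.h>
    #include <unistd.h>
    #include <stdio.h>
    #include <string.h>
    #include <errno.h>

    int main(int argc, char *argv[])
    {
        if (argc != 3) {
            fprintf(stderr, "usage: %s source dest\n", argv[0]);
            return 1;
        }

        /* open(2) returns a small integer file descriptor, or -1 with errno set */
        int in = open(argv[1], O_RDONLY);
        if (in == -1) {
            fprintf(stderr, "open %s: %s\n", argv[1], strerror(errno));
            return 1;
        }

        int out = open(argv[2], O_WRONLY | O_CREAT | O_TRUNC, 0644);
        if (out == -1) {
            fprintf(stderr, "open %s: %s\n", argv[2], strerror(errno));
            return 1;
        }

        /* read(2)/write(2) move raw bytes between the descriptors and our buffer */
        char buf[4096];
        ssize_t n;
        while ((n = read(in, buf, sizeof(buf))) > 0) {
            if (write(out, buf, (size_t)n) != n) {
                fprintf(stderr, "write: %s\n", strerror(errno));
                return 1;
            }
        }
        if (n == -1) {
            fprintf(stderr, "read: %s\n", strerror(errno));
            return 1;
        }

        close(in);
        close(out);
        return 0;
    }

The same few calls sit underneath fopen/fread in the C library, file objects in Python, and so on, which is why this layer makes such a useful mental model for whatever you build on top of it.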
Corrigendum: Consulting the 2nd edition of Rochkind's AUP book, it says the 1st edition covered about 70 system calls, while the 2nd edition covers about 300, gleaned from the SUS (https://en.wikipedia.org/wiki/Single_UNIX_Specification).