I can summarize my path. Ages 12-20, I wrote code every day. Took a QA job at a small software company, moved to the development team, and spent 6 years there. Used that experience to get into Microsoft. 5 years there. At that point, there were almost no companies that wouldn’t interview me, and it was just on me to do well in the interviews. I have practiced whiteboarding throughout my career.
So the idea is to get your foot in whatever door you can at that time, and climb when you can.
I didn't go to college, have no "Education" section on my resume. Senior-level SDE, 8 years of experience.
I've been asked about my education in initial recruiter screens maybe 10% of the time, and it's come up later in interviews maybe 20% of the time. I've never had it clearly and obviously preclude me from moving forward (as in being obviously dropped from consideration after that conversation).
I've found most people who ask to be curious, but keen to move on and talk about my experience. It might be different at the entry level.
Answered further down, but this take seems very pessimistic. So what if this doesn't result in a computer science degree? It absolutely will result in a deeper understanding of the theory of computer science, its practical uses in software development, and a hardware-level understanding of how computation works in practice.
Why does that matter? Think of this from the perspective of a bootcamp grad working in tech, for example. That person might want to complement their practical skills with theoretical knowledge and deeper understanding. That understanding can help them hone their skills and develop professionally as an engineer.
There are lots of people for whom this content is very valuable. For many of those people, being able to use the title "computer scientist" is totally irrelevant. I have a degree in computer science, but I don't call myself a computer scientist.
I don't see the reason to disparage the content because it doesn't come with the title. Maybe I'm missing something you can share?
What is it about CS that makes it a field that can be self-taught? Can you become a self-taught physicist? I think you can't. Then what is the difference between studies in CS and physics?
What makes it a field that can't be self-taught? What defines the same for any field?
To take the best reading I can of what your point is: you're saying a pupil will not self-study this course and go into CS academia. I agree with you there.
To me, that seems a poor reason to criticize this course, the content, or people who want to pursue it. There are many reasons to learn undergraduate CS material besides pursuing academia (again, self as an easy example: CS -> SWE). I don't feel there was anything in my degree which could not have been self-studied with enough tenacity.
If it's about joining academia, I get your point. Otherwise, really unsure why you're fixated on the title. Anyone can absolutely self study physics the same as computer science, and use that knowledge to advance their understanding in other fields, or hobbies, or just pursue knowledge for the satisfaction. Those all seem worthwhile to me :)
I think you need to google "self-taught scientists". Unfortunately, from my limited perspective, credibility has somehow become equated with degrees (i.e. titles and money) rather than actual knowledge and skills in many fields. In today's world it is socially unacceptable not to have a degree; the value of actual skills and knowledge matters less than having a PhD title. I really hope that changes in the future.
I don't think anything would stop someone from publishing a great paper in physics or computer science on their own in 2021.
Albert-Laszlo Barabasi's book The Formula is kind of about this. While nothing would stop an individual from publishing such a paper, outside a major company or university setting an individual basically wouldn't have the network needed for the paper to get noticed.
This comes back to your (or my) definition of science. To me it's just about furthering knowledge through provable explanations.
Is it more efficient if you collaborate with other people with the same goal (e.g. Academia)? Yes.
Is it a necessity to conduct science? Absolutely not; how do you think it was done in the past? Almost all of the early scientists were self-taught individuals.
> Then what is the difference between studies in CS and physics?
The former needs paper and pencil. The latter requires expensive equipment to prove your theories. At best you can learn theoretical physics by yourself and formulate some hypotheses. Proving them might often be a completely different matter.
I think self-teaching is much more about the student than the subject. As for CS (but really, programming) the investment for equipment and learning materials is comparatively next to nothing.
You can't say you have a degree in computer science, but you can certainly learn and practice the theory and concepts just as well as anyone else. And to that end: I see lots of good material!
alienation from any community. from peers, from elders, from others who are passionate about sharing knowledge via human-to-human transmission. and no - automated learning is not a computer-mediated transmission, in the same sense that reading 12 books by 12 authors does not mean you had 12 teachers.
Who cares? The academic community is often alienating everybody who is not in academia, degree or not. And where it isn't, it doesn't care if you have a degree or not.
The hard truth is that there isn't much "science" in most computer "science" jobs. The world needs brick layers too. Nothing wrong with that as such, but confusing computer scientists with programming workers is one of the root causes for the software mess these days.
self-educated know-it-alls are much more responsible for the appalling level of alienation of people in IT in general. since so many people are self-taught, they never learn to exchange knowledge with other humans.
I have multiple degrees in history yet I’m completely detached from the academic community because I don’t do research and don’t publish anything. So if I follow your logic it means I (and many thousands like me) have no degree at all?
Most people that do computer science degrees don't call themselves computer scientists afterward. In fact, the only people I've ever known that call themselves that title are those in academia.
a better computer scientist needs interaction with other people, needs to learn to respect their views or at least understand and evaluate them. perhaps needs to meet other scientists, to exchange ideas and approaches. this is what schools and academia are for.
this whole self-taught thing is a time-bomb. we already have too many self-taught engineers, architects and what-nots who can only work with themselves, listen to themselves, and agree with their own designs. not okay.
being good at anything more often than not also means being good at working with people.
even though you can call yourself a better piper or a better kiter, that doesn't mean you actually are better at anything.
This is a red herring. This is for improving your technical competency, not for improving your social skills. It doesn't mean social skills are unimportant, it just means that this doesn't focus on that part of the equation. Do you have any issues with the technical material they suggest you focus on?
automated tools are great for improving certain tech skills, the way Duolingo is good as a language starter. But only talking with other people will really teach you the language - people teach each other even when they don't intend to. Same with technology.
Social skills are very important; they could not be more important in these times. 15+ years in academia, not as a researcher but as a teacher, showed me one thing: the best learners are those who are best at listening and interacting with others, not the bookworms. They are usually not the top students, but they most often become top contributors to teams and companies later on.
Surely now and then an Einstein is born, but we also know of people who were very active in their social interactions without all the academic fuss. Academia is one place to find people to interact with, and academia is not only Oxford and MIT, where entry is very hard, nearly impossible.
So all of this is not about academia, rather about being human with humans.
I personally find algebra and calculus quite useful for general problem-solving skills. Sure, it doesn’t provide you with concrete skills in the latest JS framework, but it does help with the ability to reason about abstract problems and generalisations. To each their own, but I find it quite useful.
(I used to be a dev but now work in strategy consulting and here I cherish my math skills every day).
It's different. Here you train your partner to be sold as part of a massive corporate product and have all your code phoned home in exchange for a little assistance.
Hmm, yes, it is probably more reliable than a Windows desktop with equivalent features. When running a simple tiling WM, Linux is solid.
If you look at the wider ecosystem, Windows doesn't look so hot. I've just moved from a Linux-based workplace to a Windows-based workplace. The number of times the main file server has to be restarted because it's doing something weird is crazy. This never happened at my old job. The Linux servers were only turned off for hardware changes; they didn't require constant restarts. If something went wrong, I was able to trace the problem down and prevent it from happening again. In Windows, everything is opaque. Some unexplained regedit is suggested for every problem, and no one seems to know what those edits actually do.
They have improved a lot, but there are still the forced restarts, as well as the poor document recovery of the office suite.
Together this is a bad combo as I'm constantly worried I might have lost some data in the documents I work on.
They have improved the document recovery quite a lot in the last two years, but they still manage to make the process rather confusing, so sometimes I accidentally save the version with the lost data instead of the recovered one.
OpenOffice had this figured out some 15 years ago.
I never lost data to OpenOffice, and I was running it on a laptop (essentially) without a battery.
I cannot even configure a printer on Windows 10 reliably. Out of ten ways to set the color mode, only one correctly reported that it was overridden "somewhere else". I still do not know where that "somewhere else" is; as a result, the printer only prints black and white when you access it from Windows. I tried uninstalling the printer and installing the most recent drivers (which only added cloud services), and went through all the dialogues and settings I could find. I never had this much hassle with a printer on Linux.
Are you even a scientist, that you're talking about the dynamics of technology exchange? I mean, are you qualified enough to talk like this? Just asking in good faith.
There was a technical snag last week, identified one hour before launch. So it was rescheduled for today after the problem was corrected. Hence the second try.