Context:
Thomson Leighton, one of the book's authors, is a co-founder of Akamai
Lectures here: https://www.youtube.com/playlist?list=PLB7540DEDD482705B
One of the set of lectures on the internet I loved the most.
a couple of other resources:
- More recent lectures on MIT OCW https://ocw.mit.edu/courses/6-1200j-mathematics-for-computer...
- A well-paced course that follows the textbook and uses online videos done by another one of the authors (Albert Meyer) of the textbook: https://openlearninglibrary.mit.edu/courses/course-v1:OCW+6....
Akamai? Could they be more concerned about the various script kiddies and scanners using Akamai (often Linode) IPv4 ranges? Please?
Each section is quite standard in presentation which isn't a bad thing.
I love that each citation has back references to _all_ the places that it is cited from.
I wish more books did this.
I found the choice of material quite nonstandard, and the writing is witty, full of MIT humor. It's a bit sad that the writing stopped in 2018.
I really love this book. It is hard af, but I can still understand 1-2 pages of each paragraph. I have gotten some great insights, like that a function is an endless list of inputs and outputs, and some really great humour, such as "all is not lost" in mathematical notation. I really hope I can understand this book completely before I die.
>1-2 pages of each paragraph.
made me laugh, envisioning the book as written by Victor Hugo.
"1-2 pages"
Simplifying: -1 pages.
Excuse me! I'm a J programmer; we have right-to-left execution order.
I always see lists of like 100 MUST HAVE books for Computer Science. Is there a top 5 of must-have books for Computer Science?
Top 5 will never cover the field. Here's my top 10
* Brookshear and Brylow - Computer Science - An Overview
* Forta - Teach yourself SQL in 10 minutes
* Stallings - Computer Organization and Architecture
* Stallings - Operating Systems - Internals and Design Principles
* CLRS
* Kurose, Ross - Computer Networking - A Top Down Approach
* Sipser - Introduction to the Theory of Computation
* Stallings, Brown - Computer Security - Principles and Practice
* Aumasson - Serious Cryptography
* Russell, Norvig - Artificial Intelligence - A Modern Approach
And even this fails to cover programming languages. Python is the lingua franca of the field. Most previously recommended books are getting outdated, but perhaps Matthes' Python Crash Course, 3rd edition.
When I need a refresher on the basics of Python, I refer to Python Distilled, and when I want a deep dive, I turn to Fluent Python. Reading these books makes me feel like I'm sitting next to an experienced, witty colleague.
I will take a look at Python Crash Course.
Just to add to this, I think John Levine's Linkers and Loaders is also a great reference.
SICP still deserves to be on such lists.
I also love Concrete Mathematics.
I prefer the Tanenbaum OS books over Stallings. In particular the design and implementation book, although it is more than a decade old now.
I agree that five books won't ever cover every discipline within Computer Science. Just providing an introductory book, a university-level textbook, and an expert/graduate-level reference for each discipline turns into a long list.
See if this blog post helps out with sorting through the various CS subjects: https://tolerablecoder.blogspot.com/2022/03/a-short-list-of-...
> * Kurose, Ross - Computer Networking - A Top Down Approach
Over TCP/IP Illustrated?
I'd make the argument that TCP/IP Illustrated Volume 1 covers the details of TCP/IP in a very "packets and fields" oriented way, while Volume 2 goes deep into the "data structures and implementation" side. That makes it a very good supplemental reference, but a less-than-ideal introductory textbook on the subject of computer networking.
Kurose's book really does take the top-down approach, from high-level networking concepts through the application layer to the transport layer and downward. It provides just enough of the necessary details (here's a datagram with fields A and B) rather than a comprehensive list of all the details (here's every field, every field size, and a list of every field option).
Not sure what your goal is, but if, like me, you don't have a computer science degree and want to fill some gaps, this site is fantastic.
https://teachyourselfcs.com/
See the "still too much section". If you want the top two books they recommend if you don't have time for the rest.
It’s very dependent on the type of work you end up doing IMO. Sort of like “which programming language should I learn?”. Not a great answer, I know..
Given unlimited time, read all of them and learn all the languages. It will all help make you a better programmer in your preferred language. With limited time (as a normal being), start with the top 100 books. Any of them. The next one will be simpler than the first...
I have an M.Sc. in Comp.Sci. Flicking through books like these, all the chapter titles resonate with courses, exams, and problems we solved. It also makes me realise I have probably forgotten more than I like to think.
On the other hand, bashing my head against graph theory and logic has made me a much better programmer. Similarly, the hours spent in Van Roy and Haridi's fairly abstract and technically language-agnostic "Concepts, Techniques, and Models of Computer Programming" primed me to learn a lot of languages fast - because I had the primitives mastered.
I have no particular book recommendation, but this book seems more about numbers than relations -- maybe my PDF search is broken, but both 'type theory' and 'category theory' return 0 results. I'd recommend also looking into those if you are interested in the mathematics of computer science.
No, there's no such agreed-upon thing; everyone has their own idea. But if I were to recommend something today, I'd say go the self-discovery route and find your ideal books for algorithms/algorithm analysis & data structures, automata theory, programming languages, operating systems & machine learning.
TAOCP
Concrete Abstractions and then SICP, both using Scheme. If you work through these, you will already understand most of the foundations of CS; learning another language will be a piece of cake.
It's more useful to practice programming through projects. Then once you feel you're missing the knowledge for a particular problem you're trying to solve, read up about that one.
Projects are essential, but I've found there is a huge problem with your advice: you have no clue about the possible solution surface.
My advice to learners has been "try to learn as much about a topic as someone who has taken the subject in college and forgotten about it".
For example, consider calculus: someone who took calc 20 years ago and hasn't used it since will probably forget exactly how to compute most derivatives and integrals. But if someone mentions an optimization problem ("we need to know when this curve peaks") or asks something involving finding the area under a curve, alarm bells should start ringing. They'll know this can be done, and likely go grab a calc book to refresh.
Another example I run across all the time, which is the opposite scenario: survival analysis. I have been on countless teams where somebody needs to understand something like churn, or the impact of offering a discount that hasn't expired yet, etc. These are all classic survival analysis problems, yet most people don't even know this field of study exists! Because of this, I've seen so many cases where people complain that "we'll have to wait months or years to see if these changes impact customer lifetime!" (Note: if anyone out there is doing churn or LTV analysis and isn't familiar with survival analysis, you are most certainly approaching it incorrectly.)
I've seen a lot of people get frustrated with self-study because they try to learn the material too well. If you aren't going to be using survival analysis soon, it's not really worth remembering all the details of how to compute a Kaplan-Meier curve. But if you have even a vague sense of what problem it solves, then when you encounter that problem in a project, you know where to go back to. You typically walk away with a much stronger sense of the subject than if you had studied it harder in the first place.
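To make that concrete, here's a minimal sketch of a Kaplan-Meier estimator on made-up churn data (the data and function here are hypothetical, and in practice you'd reach for a library like lifelines; this is just to show the shape of the idea). The key move is that customers who haven't churned yet aren't discarded -- they're treated as censored observations:

    def kaplan_meier(durations, events):
        """Return (time, survival) pairs estimating S(t).

        durations[i]: how long subject i was observed (e.g. months).
        events[i]: True if they churned at that time, False if censored
                   (still a customer when observation ended).
        """
        n_at_risk = len(durations)
        survival = 1.0
        curve = [(0, 1.0)]
        for t in sorted(set(durations)):
            # Churn events at time t, among those still at risk.
            deaths = sum(1 for d, e in zip(durations, events) if d == t and e)
            if deaths:
                # Multiply in the conditional probability of surviving past t.
                survival *= 1 - deaths / n_at_risk
                curve.append((t, survival))
            # Everyone whose observation ends at t leaves the risk set,
            # whether they churned or were merely censored.
            n_at_risk -= sum(1 for d in durations if d == t)
        return curve

    # Hypothetical data: months observed, and whether the customer churned.
    months = [2, 3, 3, 5, 8, 8, 9, 12]
    churned = [True, True, False, True, False, True, False, False]
    for t, s in kaplan_meier(months, churned):
        print(f"S({t}) = {s:.3f}")

The point is that the censored customers still inform the estimate (they shrink the risk set without counting as churn), which is exactly the information the naive "wait until every contract ends" approach throws away.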
Computer science is to programming what physics is to engineering. They are not the same thing. You can do some programming without knowing computer science, but a general knowledge of computer science will give you a much more solid foundation, and for some aspects is indispensable.
That's a little like saying that if you want to learn mechanical engineering, fix things around your home and then do research when you get stumped.
Building a bunch of software projects probably isn’t a very efficient way of learning computer science. You might figure out things like big-O or A* on your own, but a more academic approach is going to take you further, faster.
It's well established that practical project work is what works best at producing tangible results, and most institutions that aim to produce the best programmers focus on that.
I can understand this is not the approach preferred by academic types, who are a strong community on Hacker News.
Most people are more motivated to understand the theory because it helps them solve a practical problem, rather than theory for the sake of theory.
I thought this thread was about computer science. Working on a programming project is related to computer science in the same way that welding together a shelf is related to mechanical engineering.
Being "handy" around the house (or even more advanced tinkering) and a mechanical engineering degree--maybe especially from a good school--are absolutely not the same thing.
Totally agree! And being able to whip together a webapp for your church is absolutely not the same thing as computer science.
Computer scientists often program but not all programmers are computer scientists.
An elitist view disconnected from reality.
Even something like game theory was only developed and earned Nobel Prizes because of its applications to making money in finance.
That seems more like a necessary precondition, than a path to learning computer science. Like you will probably need to learn penmanship and multiplication tables before you get into real mathematics, but, that isn’t really mathematics.
I like this book. The probability section is great, especially how they handle the Monty Hall paradox. They use a "four-step method" that breaks it down perfectly - way clearer than the explanations you get in 21 or Numb3rs.
I've discovered that the 2017 edition is available print-on-demand in the UK via AbeBooks. I prefer paper for dipping into and working through bits and pieces.
re: Section 15.8 on the so-called pigeonhole principle
Following Dijkstra's EWD1094, here's a way to solve the hairs-on-heads problem that eschews the language of pigeonholes and instead uses the fact that the mean of a non-empty bag of numbers is at most its maximum.
We are given that Boston has 500,000 non-bald people. The human head has at most 200,000 hairs. Show that there must be at least 3 people in Boston who have the same number of hairs on their head.
Each non-bald Bostonian must have a hair count between 1 and 200,000. The average number of such people per hair count is 500,000 / 200,000 = 2.5. The maximum is at least the mean; moreover, it must be an integer. So the maximum >= 3. QED.
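In symbols, with $c_h$ denoting the number of non-bald Bostonians having exactly $h$ hairs (notation I'm introducing here, not Dijkstra's), the whole argument is:

    \max_{1 \le h \le 200000} c_h \;\ge\; \frac{1}{200000} \sum_{h=1}^{200000} c_h \;=\; \frac{500000}{200000} \;=\; 2.5

and since each $c_h$ is an integer, so is the maximum; hence it is at least 3.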
I took a look at the table of contents and found that the second chapter is about the well-ordering principle. That's surprising to me because I've only heard of the well-ordering theorem by Zermelo, a fundamental theorem of set theory stating that, assuming the axiom of choice, any set has a well-ordering. It's amazing and mind-bending in its own right (imagine a well-ordering of the reals), but is clearly not very relevant to computer science.
I find the well-ordering principle slightly bewildering. It seems to presuppose the existence of an ordering on natural numbers and then prove this principle. But I’ve never been taught things this way; you always construct the natural numbers from Peano and define the ordering first, then you can actually prove the well-ordering principle rather than leaving it as an axiom.
The well-ordering principle, the axiom of choice, and Zorn's Lemma are all "equivalent", meaning you can pick any one as an axiom and prove the other two.
So some textbooks may pick one as the axiom and others pick a different one.
The crazy thing about the well-ordering principle: it states that a well-ordering exists on the reals, which means you can find an ordering such that any non-empty subset has a minimum. Apparently, elsewhere in mathematics, they've proven that even though such an ordering exists, you cannot explicitly define one.
There's a common joke:
"The Axiom of Choice is obviously true, the well-ordering principle obviously false, and who can tell about Zorn's lemma?"
You are talking about the well-ordering theorem, not the similarly named well-ordering principle. That’s exactly my confusion when I first opened this PDF.
Different folks use different conventions. When I was taught it, they called it the principle, not theorem. You can find similar comments on the Internet (e.g. math subreddit).
Here's one that acknowledges it:
https://math.stackexchange.com/questions/1837836/well-orderi...
> The "well-ordering principle" has (at least) two different meanings. The first meaning is just another name for the well-ordering theorem. The second meaning is the statement that the usual relation < on the set N is a well-ordering. This statement is equivalent to the statement that ordinary induction on the natural numbers works.
Not to say it isn't useful to a CS education, but the only time I've ever run into the well-ordering principle was to establish the foundation for mathematical induction proofs. Students usually learn this in discrete math for CS in undergrad. Then, in the many later undergrad courses that are algorithms-focused, the proofs tend to use induction and no one really thinks of the WOP.
Yeah. I have had several different courses teach induction, and nobody really thinks of the WOP. I'm pretty sure most of them skip the WOP when introducing induction.
I've seen it used when people prove induction as a theorem. Sometimes they just take the technique of induction as given and don't prove it.
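For reference, that proof is short (sketching it here from memory, so check a text for the careful version). Assume $P(0)$ and $\forall n.\, P(n) \Rightarrow P(n+1)$, and suppose $P$ fails for some natural number. Then the set of counterexamples

    C = \{\, n \in \mathbb{N} : \neg P(n) \,\}

is non-empty, so by the WOP it has a least element $m$. Since $P(0)$ holds, $m \neq 0$, so $m-1$ is a natural number not in $C$, i.e. $P(m-1)$ holds. The induction step then gives $P(m)$, contradicting $m \in C$.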
I've not worked through a large book of problems like this before. At the risk of sounding silly: are there solutions to the sample problems? I've given a few a go but can't find the answers anywhere to check my work.
Math Academy has a comparable Discrete Math course that shows you how to solve every problem after you submit a solution and incorporates spaced repetition.
I looked into this book before, and without solutions it's much harder to use for self-study. Maybe LLMs change that now, but I'm not sure I'd trust their output if I were learning the topic.
Susanna Epp's Discrete Mathematics With Applications is also a really good option
The logical skills to evaluate the output of an LLM are the same skills brought to bear reading any book. What makes you trust this textbook, then? Textbooks are not infallible.
Good textbooks have gone through expert review and multiple iterations of improvement. That can't be said of an LLM answering your personalized questions or the book's problems.
But why not both?
Such problems are a cakewalk for LLMs, you realize? There are lots of didactic activities you could do with them.
I hate to be that guy, but ... frontier LLMs have gotten quite good at problems like these!
I was recently struggling with a linear algebra problem. It wanted me to prove X. Going one route, I could prove X. But then, strangely enough, going another route, I disproved X!
I went to Gemini and asked how it could be so, and it pointed out flaws in my proof. Very helpful!
It is not a silly question at all; a companion book with workings and answers makes perfect sense. But universities and academic institutions that create materials like this are often wary of publishing solutions, because they reuse the questions in classes, rotating the same questions over a span of about 5 years. Realistically, the pool of viable questions for testing a small module of a subject is rather small.
This is a very valuable resource for me. Thanks for posting!
This is why I love Hacker News - I've literally been looking for this recently, and now I get it as a full PDF.
Does anyone have recommendations for better screen readers?
A screen reader that can read a PDF with LaTeX formulas? I'd be surprised if that's possible at all. I can't even name most of the symbols in those formulas.
Give https://www.paper2audio.com/ a try; it is targeted at just this use case. It’s a Seattle-local startup.
I'm the Paper2Audio founder and I'm thrilled to see you recommending us here. Paper2Audio specializes in narrating complex documents like research papers to you. It is free for personal use.
This PDF exceeds our page limit, so you would have to split it up. We're working on increasing our page limits.
This is great! However, would it be possible to add dark mode support to the PDF view? Otherwise I have to manually follow along using Adobe Reader (which has a night mode), or separately convert PDFs to inverted-color versions. The latter is relatively straightforward, but having it integrated into the viewer would be much more convenient.
I'm not such a fan of trying to cram everything-mathematically-relevant into a single huge book (and it is huge - 1048 pages).
Anyway, this reminds me of a rather different initiative in the same vein, building mathematical principles out of the practical needs of computer science: Concrete Mathematics, by Ronald Graham, Donald Knuth, and Oren Patashnik.
https://www-cs-faculty.stanford.edu/~knuth/gkp.html
https://en.wikipedia.org/wiki/Concrete_Mathematics
available on the Internet Archive: https://archive.org/download/concrete-mathematics/Concrete%2...
Graham/Knuth/Patashnik is a lot less "basic discrete maths you're most likely to need" and a lot more "number sequences we've known and loved". Almost more useful for physicists due to the amount of summation fu you'll learn there.
The book Concrete Mathematics started as course notes for a class whose textbook initially was the (dense) "Mathematical Preliminaries" chapter of The Art of Computer Programming (Chapter 1 and roughly the first half of Volume 1), so it can be seen as an expanded and leisurely (and even more delightful, because of all the student jokes and other marginalia) version of that chapter. This is mathematics that Knuth needed for the rest of TAOCP.
So it's more "mathematics for the analysis of algorithms" (incidentally the title of another book, by Greene and Knuth), and so probably most applicable to the field of "AofA" rather than to physics or computer science in general.
Lovely book, very few math books fill one with as much joy as this one.
I did not see a proof of the correctness of circular buffers: one consumer, one producer, parallel execution, two atomic pointers -- one read pointer, one write pointer -- and the cycle bits.
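For anyone curious what that structure looks like, here's a rough Python sketch (entirely hypothetical code; a real lock-free version would use atomic loads/stores with explicit memory ordering in C/C++/Rust, and this one sidesteps the cycle bits by sacrificing one slot to tell "full" from "empty"). The correctness argument hinges on each index having exactly one writer:

    import threading

    class SPSCRingBuffer:
        def __init__(self, capacity):
            self.buf = [None] * (capacity + 1)  # one slot kept empty
            self.head = 0  # next write position; written only by the producer
            self.tail = 0  # next read position; written only by the consumer

        def push(self, item):  # call from the producer thread only
            nxt = (self.head + 1) % len(self.buf)
            if nxt == self.tail:        # full: head would catch up to tail
                return False
            self.buf[self.head] = item  # fill the slot first...
            self.head = nxt             # ...then publish it by moving head
            return True

        def pop(self):  # call from the consumer thread only
            if self.tail == self.head:  # empty
                return None
            item = self.buf[self.tail]
            self.tail = (self.tail + 1) % len(self.buf)
            return item

    # Tiny demo: producer and consumer spinning on the shared buffer.
    q, DONE = SPSCRingBuffer(4), object()

    def produce():
        for i in list(range(10)) + [DONE]:
            while not q.push(i):
                pass  # buffer full; spin

    def consume():
        while True:
            item = q.pop()
            if item is DONE:
                break
            if item is not None:
                print(item)

    t1 = threading.Thread(target=produce)
    t2 = threading.Thread(target=consume)
    t1.start(); t2.start(); t1.join(); t2.join()

In CPython the GIL makes the plain index updates safe enough for this sketch; the proof obligation in a truly parallel setting is showing that the store to the slot becomes visible before the store to the index, which is exactly where the atomics (or cycle bits) earn their keep.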