Tales from the jar side: Teaching deep learning and neural networks, My wife does good while I tolerate (some) evil, and the usual toots and skeets
If these calories are empty, why do I keep gaining weight? (rimshot)
Welcome, fellow jarheads, to Tales from the jar side, the Kousen IT newsletter, for the week of January 5 - 12, 2025. This week I sat in on the Tuck Bridge program offered by Dartmouth College to remote students at several colleges, including Trinity College in Hartford, CT.
Preparing for My AI Course
This semester at Trinity College I’m offering a course called Artificial Intelligence. The description is rather dated, so I’ve been trying to figure out what to include to make it more relevant to the students. Of course I’m going to include Generative AI, but I taught a new course last semester on AI integration, so this time I want to get more into the theoretical background.
While that seems like a good plan, it’s a bit ambitious. My own background with AI, prior to the release of ChatGPT a couple of years ago, dates back to the late 80s through the mid-90s, which is when neural networks last went through the typical AI cycle:
Show cool results
Over-promise about what was coming
Under-deliver for many reasons
Fade back into academia for another generation
Seriously, AI has been doing this for decades. This time feels different, because the successes and the promises are bigger and more mainstream, but we’re already starting to see the backlash. (If I have to hear Sam Altman hint that they’re near Artificial General Intelligence one more time, when all he’s trying to do is raise yet more capital for OpenAI to burn into the ground….)
The last time I played seriously with neural networks, there weren’t many libraries available to experiment with, and I never had a project or funding (or, to be honest, the expertise) to write my own. In the mid-90s I used a MATLAB plugin to try to simulate an aerodynamic flow with neural networks for a project during my MS program, and I quickly realized that was the hard way to do it.
In preparing for the new semester, I’ve found the world has changed. Now everything is in Python, and there are books and YouTube videos about neural networks everywhere. I also learned that what I used to just call “neural networks” is now known as deep learning, and it’s more or less a prerequisite for understanding large language models. And in order to understand that, you need some basic linear algebra, and so on.
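To give you a sense of the level of linear algebra we’re talking about: a single layer of a neural network is just a matrix multiply, a bias add, and a nonlinearity. Here’s my own toy sketch (not from any of the course materials):

```python
import numpy as np

# One "dense" layer: outputs = activation(inputs @ weights + bias).
# This is the linear algebra hiding inside every neural network.
rng = np.random.default_rng(42)

inputs = rng.normal(size=(4, 3))   # a batch of 4 examples, 3 features each
weights = rng.normal(size=(3, 5))  # maps 3 input features to 5 outputs
bias = np.zeros(5)

z = inputs @ weights + bias        # the matrix math
outputs = np.maximum(z, 0)         # ReLU, the nonlinearity

print(outputs.shape)  # (4, 5)
```

Stack a few of those layers end to end and you have a deep network, which is where the “deep” in deep learning comes from.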
So while I’ve been attending the Tuck Bridge program and listening to professors talk about marketing and strategy and corporate finance, I’ve also been shaving various yaks to figure out what I need to explain to my students.

(Yeah, DALL-E 3 still has a long way to go, but that’s about the best version I got. I have no idea why one yak has a label over its eyes, but I guess it’s impressive enough that they’re lined up in a row like that.)
For the class I’m going to use the new Manning book Build a Large Language Model (From Scratch), by Sebastian Raschka. He uses PyTorch to implement the LLMs, which sounds like fun. It’s only when I started looking at Appendix A, the introduction to PyTorch, that I realized I had a lot more work to do. That appendix goes on for over 30 pages and assumes way more experience than I can expect from my students.
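If you’ve never seen PyTorch, here’s a tiny taste of the sort of material that appendix covers: tensors, automatic differentiation, and one step of gradient descent. (My own sketch, to be clear, not an excerpt from the book.)

```python
import torch

# Tensors are PyTorch's n-dimensional arrays; autograd records the
# operations applied to them so gradients come for free.
x = torch.tensor([1.0, 2.0, 3.0])
w = torch.tensor([0.5, -1.0, 2.0], requires_grad=True)
b = torch.tensor(0.1, requires_grad=True)

y_pred = torch.dot(w, x) + b    # a one-neuron "model"
loss = (y_pred - 4.0) ** 2      # squared error against a target of 4.0

loss.backward()                 # autograd computes dloss/dw and dloss/db
print(w.grad, b.grad)

# One step of gradient descent, done by hand:
with torch.no_grad():
    w -= 0.01 * w.grad
    b -= 0.01 * b.grad
```

Scale that single neuron up to billions of parameters and you have, very loosely, what the book builds toward.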
Fortunately, the author has his own website, which includes a free, if slightly dated, course on deep learning. I like his approach, and his videos build everything up from practically nothing. I’m going to rely on that background for at least the first couple of weeks of my class.
Incidentally, if you don’t know about it, there’s a great playlist of YouTube videos on how neural networks work by the inimitable 3blue1brown, called Neural Networks. I find it amusing that the first four videos are all from 7 years ago, and then there’s a jump and the rest are from 2024. In other words, “Here’s how deep learning works, and now there’s this thing called ChatGPT. Wouldn’t you like to see how that works, too?”
My class attracted way too many students, and I’m a soft touch, so I kept letting them in until the department told me to stop because they might not have a room big enough to hold everybody. We still have a waiting list. Maybe I should make the first couple of assignments really tough so half of them drop out. I can’t do that, but I know professors who have done that sort of thing in the past. Anyway, I’ll let you know how it goes.
Good vs Evil
While I’m in the Tuck Bridge program this week, my wife is down in Mississippi with a church group making the world a better place. They’re helping to feed the poor, and painting apartments, and cleaning rooms, and just doing all sorts of odd jobs for the needy. The church group has a partnership with a mission down there and goes every year, and this year my wife and a friend of hers decided to go along.
The only downside is that they got stuck there for an extra day due to weather. While the fires in the LA area have (deservedly) gotten the press attention, Atlanta and much of the South got snow this week, and frankly, Atlanta doesn’t do snow. Since nearly every flight goes through Atlanta…
(That was about the best line in the Family Guy Star Wars parody. Note Stewie as Vader at the end of the line.)
… when Atlanta gets snow, everything shuts down. Hopefully she’ll be home late tonight.
I have to say, though, that my wife doing such pure good made for a rather dramatic contrast with the business program. The professors were fine and the students were fine, but the program also brought in some alumni during lunchtime to talk about their careers, and I had trouble keeping quiet during some of the talks.
Specifically, one wanted to talk about his career in private equity and venture capital, and he even risked asking for questions. I wanted to say, “How long after you joined a private equity firm did it take before you realized you no longer cast a reflection in a mirror? Did they extract your soul on day one, or did it fade away over time?”
I didn’t say a word. Wouldn’t have been polite, and besides, I’m mostly a guest anyway.
Then another person talked about how his firm specialized in wealth management for families with a net worth starting at $150 million, though more typically in the $300 - $400 million range. I wanted to ask how many millions those families were sending to the relief efforts in California, or even how many thousands (or tens of thousands) of families they were helping out in their own local communities, but again I kept quiet.
Look, I was born in the 60s, but I graduated from college during the Reagan era. I’m supposed to believe that greed is good, right? The problem is that as I’ve gotten more successful over the past decade, a strange thing has happened. Once I was no longer drowning (in debt) and had my head above water, I could better see all the suffering going on around me. I understand there’s only a limited amount I can do to help, but the sort of wealth inequality we see in the world now is unseemly at best and downright offensive at worst.
(Don’t get me started, yet again, on Elon. How much is he doing to help? Nothing. He could give $100 million tomorrow to help the people who have lost their homes in the fires, and he’d never miss it or even know it was gone. But no, he’s way too busy spreading lies and misinformation to care about anybody other than himself.)
Let me be clear: I have nothing against the Tuck Bridge program. The professors are excellent, and honestly it’s about time I learned a lot of the concepts they are teaching. I’m sure it’s been really valuable for the students. But there’s a reason I followed the career path I did rather than the direction they’re headed.
Toots and Skeets
Thunderbolt and lightning
Thunderbolt and Lightning cables, get it? Took me a minute. It’s a reference to Queen’s Bohemian Rhapsody.
Wars vs Trek
Now that’s clever.
Impossible not to do in his accent
Speaking of Connery, though he was excellent in Highlander, it still cracks me up that in a movie featuring someone from Scotland (played by an actor who wasn’t Scottish), he played a Spaniard / Egyptian.
With a theme song by Queen, too.
Dad joke FTW
That’s nearly the Platonic ideal of a Dad joke. I have to find somewhere to use that. Oh, and:
Honeydo list
Yeah, might want to get on that.
Wasn’t the red light enough?
That’s a big ask. I could handle Don’t Stand So Close to Me, maybe, or Message in a Bottle, or Every Little Thing She Does Is Magic, or even Every Breath You Take, but only Sting himself can sing Roxanne.
I imagine you’re just happy it wasn’t yet another Queen reference.
Untouchables
This is so close to a great gag. That should be Eliot Ness, right? I laughed anyway.
Ought to be entertaining
I’m sure we’ll be hearing more about that.
And finally:
At least it wasn’t mid
I hate when that happens.
Have a great week, everybody!
Last week:
Tuck Bridge Program, week 2.
This week:
Tuck Bridge Program, week 3.
Yeah, I saw that, though I'm not sure about it. He's a controversial figure, so I was waiting for some confirmation.
Hello Ken, I've been reading your newsletter since I took a course.
Regarding developing an AI theory course, two axioms my professor shared long ago have guided me well.
1. The key word in artificial intelligence is artificial. It's only as smart as the people and experience that programmed it. This is very interesting in today's world, because how do you get your data to train on? What constitutes internet dreck (most things), okay, good, and great writing? If you only go with great, you don't have enough writing to train with... A key limitation I have found in my own work is that AI isn't trained to know the difference between Framework v2 and Framework v3, so it gives mixed-up answers. (Notably, it doesn't have the expertise because it wasn't trained with the version in mind, since most Stack Overflow posts and other sources don't explicitly call that out...) Consider writing a novel: if you trained only on (bad) fan fiction, will you get (bad) fan fiction when you use AI to write?
2. AI is ultimately all matrix math. That is perhaps a bit of an oversimplification, but it's a very good guiding principle for remembering what is going on in the background. "Neural network" is a buzzword that represents a certain approach to machine learning. It's still machine learning, and it reduces to matrix math. And of course, that's its power. Computers are very good at doing lots of math quickly.
I've been out of the loop for 10 years, so I'm not aware of all the fancy new stuff they're doing with natural language. But perhaps #2 at least serves as a way to understand the history.