The Invent with Python Blog

Sun 07 March 2021

Al Sweigart's PyTennessee 2020 Opening Keynote

Posted by Al Sweigart in misc   

In March 2020, just before the lockdowns began, I gave the opening keynote at PyTennessee 2020. Unfortunately, it wasn't recorded. But I still have the script I used, so I re-recorded the talk and posted it to my YouTube channel.

VIDEO: Al Sweigart's PyTennessee 2020 Opening Keynote

Below is a transcript from my speaker notes:

Hello, Community. Hello PyTennessee. Thank you for having me, thank you to all the conference organizers. I really appreciate all the efforts the organizers put into this conference. And thank all of you for attending PyTennessee. Thank you. I’ve never given a 9am talk before. It’s a bit earlier than I’m used to. Also, I realized on the flight here from Seattle that the timezone change meant this was more like a 7am talk for me. And that registration was at 6am, and I had to get up around 4:30 in the morning. So this could be a very interesting talk.

Hi. I’m Al Sweigart. I’m mostly known as the author of Automate the Boring Stuff with Python.

It's a book for complete beginners, and it's available for free online under a Creative Commons license. Let me tell you about myself. I like cats, I like origami, and I absolutely hate telling the story of how I learned to program. I hate it. I just... can't stand it.

When I was in elementary school, a friend of mine found a book on programming games in a language called BASIC. I was pretty lucky that I grew up with a PC in the house.

Of course, back then computers typically only had about 700 PB memory, a 12 gigahertz quantum CPU, and the hard drive was only big enough to store three, maybe four simulated universes. You know, typical Commodore 64 type of computer.

I was really into Nintendo, so programming seemed really cool to me. After high school I majored in computer science, and then I became a software engineer, and I moved to San Francisco, and then I started writing programming books. The fourth book I wrote was picked up by No Starch Press, and I now write books full time. I hate telling this story. It's a true story, but it's also an incredibly misleading one. I hate telling this story because people might hear it and think, "Well I didn't start programming in the third grade." or "I didn't have a computer growing up." or "I didn't major in computer science." or "I didn't go to college." or "I didn't do well at math." People get the misleading idea that in order to become a capable software developer, you need to have started young, been born with a genius IQ, and had elite training. That's of course complete nonsense.

The part of my story that doesn't get told, is that the games I made weren't really that good. They were all derivatives of games I had copied from other people. I really only had one book that I learned programming from, and all of the others were technical manuals that were way over my head. I didn't have Wikipedia or search engines or YouTube tutorials. I knew like three other kids who were into programming and they didn't know much more than I did.

Programming is so much easier now. It's still hard, but there are more resources available, the languages are more readable, the tools and documentation and community are just so much better now compared to the 1990s. And we have the internet so we can reach all of these things. And so in the introduction of the second edition of my book, I wrote "Everything I learned about programming in the years between grade school and high school graduation could be learned today in about a dozen weekends. My head start wasn't really much of a head start."

But the expectations today are so much higher. When you say programming, people today think of self-driving cars or triple-A video game titles or social media websites with millions of users. No one's really impressed by Guess The Number when billion dollar tech companies are household names.

You've probably seen news stories that say something like, "These local teens may be the next Bill Gates, Mark Zuckerberg, Elon Musk, or some other white guy with rich parents." These stories are everywhere, you can find them by doing a news search for "whiz kid programmer". The articles never go into that much detail about the actual software these teenagers create. So I started digging and I noticed that the programs these whiz kids made... aren't great?

I don't mean to take away from their accomplishments, they are way ahead of most other kids and certainly ahead of me when I was their age. But mostly their programs are nice demos or re-creations of existing apps. They aren't geniuses or self-made millionaires, because producing actual software is so much more than just writing code. But that gets lost in these news stories about "whiz kids".

How about me? Was I one of these "whiz kids"? Well, I don't mean to brag but in high school my computer science teacher picked me and two other students to represent our high school at a city wide programming competition. Oooo.

So there were 3 of us on our school's team. There were 14 teams representing 14 high schools. We had 3 hours to solve 10 problems. The easiest problems were worth 1 point, the harder problems were worth more. At the end of a grueling three hours, our team scored 1 point. Yeah the programming language was C. And all the problems, except the first one, required reading input from a text file, and I didn't know how to do that. And neither did the two other kids on my team. We spent three hours reading help menus and documentation to try to figure out how to read text from a text file. And we got nowhere. So we could only answer the first question. But the thing is, we tied with or did better than 10 of the other schools. 11 out of 14 teams at this competition of hand-picked computer nerds scored 0 or 1 point.

(pause) How did I get here? How did I manage to become a software developer and write a popular programming book and get invited to give the opening keynote at a conference? How did I get here? Was I a bright and talented computer whiz kid? (Laughter) Sure I was.

I was lucky. I was lucky to grow up in a house that had a personal computer. I was lucky to have a friend who also grew up in a house with a PC and he showed me that BASIC programming book that luckily the school library had. I was lucky I had free time and no pressure so I could just explore programming and have fun making mediocre video games. I was lucky that no one got in my way, no one ever took a look at me, at my skin and my face and said "Mmmm, I don't think you'd be interested in that nerd stuff." I was lucky, and luck can strike anyone, but it can't strike everyone.

So, on a lighter note: I don't like putting pop culture references in my talks because, frankly, it gives away how old I am. But as a kid I had a VHS tape of, I mean, I had a Blu-ray disc, of this one science fiction movie from the 1970s. It's called, uh, Star Wars? (pause) Oh good, some of you have heard of it. It's a really quotable movie. "May the force be with you", if you've ever heard that, it's from this movie.

Anyway, there's a scene where a space wizard is teaching his apprentice how to use a laser sword. And he puts this helmet on the apprentice so he can't see, and he tells him to block shots with the laser sword using his magic powers instead of his eyes. And he does! He pulls that off. But then this other character says, "I call it luck." And the space wizard replies, "In my experience there's no such thing as luck." And I thought, "That's cool." Yeah. That's like me. I make my own luck. I'm a ten year old watching Star Wars and I know how the world works. You have to have talent and work hard and the best people rise to the top of our meritocracy.

(pause for laughter)

There's a saying that "luck favors the prepared", which implies that random chance and systemic factors don't play a significant role and talented, hard-working people are rewarded. And that's kind of true. But I discovered another saying, "Luck favors the supported." I got where I am today because I had a lot of support. Not everyone can be lucky, but everyone can be supported. But in order to do that, we need to create inclusive communities and lower the barriers to entry. We can't just use the same excuses of "people need to have a thicker skin" and "people need to earn their place in our community". That's not high standards, that's just hazing. We need to lower the barriers to entry, and if we can't lower these barriers, we need to smash them. We can't just hope that people find the right book and then everything falls into place and they become software developers giving keynotes.

Now, there's going to be some people watching this talk who'll think, "Oh brother. Al, this is a programming language conference. Why are you talking about diversity and inclusion? Code doesn't care about your feelings, you SJW. Why can't you be objective and rational? You should be talking about algorithms or arguing why Python 3 sucks."

Why is diversity important? Let me tell you a story that takes place before the story of how I learned to program.

In 1985, Nintendo released the 8-bit Nintendo Entertainment System and Super Mario Brothers. This was the first video game I ever played and it directly led to me wanting to learn how to program. But there were earlier video games.

In 1972, Atari released Pong which became immensely popular and introduced a wide audience to video games. Lots of companies started making their own knock offs of Pong with their own hardware. Atari released their own home console which had hundreds of game titles.

One of which was ET in 1982. This was based on the ET movie, which was very popular at the time. But the game was rushed. The developers only had five and a half weeks to make it so it could be released in time for Christmas. When it came out, people hated it. It was confusing, you weren't sure what you were supposed to do or where to go, the controls were bad, and it was just... boring. Compared to the beloved movie, the game was a huge disappointment.

Not only that, ET is often cited as one of the worst video games of all time and one of the biggest commercial failures in video game history. And this was at a bad time for video games as an industry.

With so many companies making cheap knock offs, consumers were getting tired of expensive hardware and lousy games. Revenue for the industry fell from a peak of $3.2 billion in 1983, to $100 million two years later. That was a 97% drop for the entire industry. This was the Video Game Crash of 1983.

Arguably, what saved video games in the 1980s was Nintendo.

Microprocessors were more powerful by then and they could support better graphics, but Nintendo's success was so much more than that. It's a bit of a game design cliche to explain how the first level of Mario teaches you how to play Mario, but it's such a great story I have to go into it.

Okay, when you start playing, Mario begins on the left facing right. This makes it obvious that you're supposed to go right. Every Mario level is designed to go from left to right, so you always know what you're supposed to do to make progress.

Now you soon find a flashing question mark block, which grabs your interest. You want to go to it but then you encounter your first enemy, an angry-looking mushroom with feet. It's just one, this isn't a huge challenge. If you run into the angry mushroom, you die but that's fine because you start again at the beginning and haven't lost much ground.

Shortly after you hit another question block and a mushroom pops out of it. This mushroom moves right, falls off the ledge, bounces off the green pipe and changes direction towards Mario. This means you're almost guaranteed to run into it, even if you confuse it for another enemy that you're supposed to avoid. When Mario touches it, he grows bigger and more powerful, so you can tell it's a powerup. This one event teaches you a lot about how objects in this world behave: there are power ups, they move on their own, they are affected by gravity, they can collide with objects and change direction.

Next you jump over that green pipe, and then there's a slightly taller green pipe after it. In Super Mario Brothers, the longer you hold down the jump button, the higher Mario jumps. Then there's a third pipe that's even taller than the second pipe. It's fine if you mess up, the pipe isn't going anywhere. This lets you practice high jumping until you've mastered it and gotten over the third pipe. Which is good because right after the green pipe is a bottomless pit. You had practice, now comes the test. If you jump too early, you'll end up falling into the pit instead of landing on the other side.

So to keep that from happening, there's a hidden powerup block that you run into which makes you fall back down immediately instead of into the bottomless pit. Out pops another powerup that gives you an extra life. This not only keeps you from making a bad jump, but it teaches you that there are secret hidden blocks that you can run into in this game. There's tons of stuff like this in the first level. You didn't see this kind of careful design in Atari games.

The main game designer behind Mario and other classic Nintendo games is Shigeru Miyamoto. There's a great quote from an interview with him. He goes...

"Well, early on, the people who made video games, they were technologists, they were programmers, they were hardware designers. But I wasn't. I was a designer, I studied industrial design. I was an artist, I drew pictures."

The person who is arguably responsible for much of video gaming's comeback in the 1980s and made me and many others into programmers, wasn't a programmer. He didn't devise new algorithms or write a compiler or anything like that. This non-programmer's impact on the programming world is immeasurable.

Diversity is important, because different people bring different skills and perspectives. They'll also have different needs which necessitate a variety of invention. The world itself is a diverse place, and if your community is a monoculture, then it's going to have blindspots and an existential danger of being irrelevant.

The thing about barriers to entry is that they're often invisible. It's easy for us to look around this room and see the people who are here. But it's also easy to forget to look for the people who aren't here. Let me put it this way: When Al Sweigart walks into a tech meetup, Al Sweigart sees a lot of people who look like Al Sweigart. I'm biracial white and Asian, I'm a cisgender man, I'm straight, I'm able-bodied and I'm in my thirties and I speak English natively and I have a college degree and I have nerdy hobbies. When I tell people I'm a software developer, no one is surprised. I'm a quote-unquote "culture fit".

Of course, there's no rule that explicitly states some kinds of people are welcome and some aren't. But there doesn't need to be one. Exclusion happens anyway.

This is a problem that is both immense and subtle, and I'm sorry that I have to gloss over so many details for a one hour talk. But for folks who are sick of me rambling about wishy-washy feelings, let's go into some hardcore technical chops. Let's be objective and rational and logical, you know like true programmers are.

This is a computer simulation called the Parable of the Polygons by vi hart and nicky case. It's a story of how harmless choices can make a harmful world.

In this simulation are triangles and squares. And triangles and squares don't *dislike* each other, they just have a slight preference for being around people like them. Nothing wrong with that, right? If a shape has less than a third of their neighbors like them, they'll move to a random new place. This doesn't seem so bad. They're not moving away from people who are different, they just have a slight preference to move closer to people like themselves.

When we run this simulation, we end up with a society that is deeply segregated. Okay, at this point, we learn the error of our ways and we decrease that bias from 33% to 0%. Problem solved, right? Well no. When we run the simulation again, nobody moves. There's nothing that pushes for desegregation, so a segregated society stays segregated. If your solution is to say, "I don't even see differences in shapes. There's only one shape and that's polygon." That is not a neutral position to have. It's a position that maintains segregation. It's a pro-segregation stance.

But is that so bad? So what if nerdy programmers just want to hang out with other nerdy programmers? What's the harm? What do we lose? I don't see anything missing.

There is a way to turn a community from segregated to diverse. And that's to have just a slight bias against monoculture. To take active steps towards creating diversity. It doesn't have to be much. If we set the simulation so that shapes want to move if more than 80% of their neighbors are like them,

We completely undo the segregation that previously took place. Now there's also issues of tokenism and assimilation but this is a short talk and I'm almost at the end of it.
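(The parable is built on economist Thomas Schelling's model of segregation, and you can sketch both rules from the talk in a few dozen lines of Python. The grid size, shape mix, and exact thresholds below are my own illustrative choices, not the parable's actual code.)

```python
import random

SIZE = 20  # the "neighborhood" is a 20x20 grid

def neighbors(grid, x, y):
    """Return the shapes in the (up to 8) occupied cells around (x, y)."""
    shapes = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            if dx == dy == 0:
                continue
            nx, ny = x + dx, y + dy
            if 0 <= nx < SIZE and 0 <= ny < SIZE and grid[ny][nx] is not None:
                shapes.append(grid[ny][nx])
    return shapes

def is_unhappy(grid, x, y, min_same, max_same):
    """A shape moves if too few neighbors are like it (min_same), or --
    under the anti-monoculture rule -- too many are (max_same)."""
    near = neighbors(grid, x, y)
    if not near:
        return False
    frac = near.count(grid[y][x]) / len(near)
    return frac < min_same or frac > max_same

def step(grid, min_same, max_same):
    """Move every unhappy shape to a random empty cell; return move count."""
    empties = [(x, y) for y in range(SIZE) for x in range(SIZE)
               if grid[y][x] is None]
    moves = 0
    for y in range(SIZE):
        for x in range(SIZE):
            if grid[y][x] is not None and is_unhappy(grid, x, y, min_same, max_same):
                ex, ey = empties.pop(random.randrange(len(empties)))
                grid[ey][ex], grid[y][x] = grid[y][x], None
                empties.append((x, y))
                moves += 1
    return moves

def segregation(grid):
    """Average fraction of same-shape neighbors, over all shapes."""
    fracs = []
    for y in range(SIZE):
        for x in range(SIZE):
            if grid[y][x] is None:
                continue
            near = neighbors(grid, x, y)
            if near:
                fracs.append(near.count(grid[y][x]) / len(near))
    return sum(fracs) / len(fracs)

random.seed(42)
cells = ['T'] * 180 + ['S'] * 180 + [None] * 40  # 45% each, 10% empty
random.shuffle(cells)
grid = [cells[i * SIZE:(i + 1) * SIZE] for i in range(SIZE)]

start = segregation(grid)
for _ in range(100):                  # "I want at least 1/3 like me"
    if step(grid, min_same=1/3, max_same=1.0) == 0:
        break
seg_segregated = segregation(grid)
for _ in range(100):                  # "move if more than 80% are like me"
    step(grid, min_same=0.0, max_same=0.8)
seg_mixed = segregation(grid)
print(f"same-shape neighbors: {start:.0%} -> {seg_segregated:.0%} -> {seg_mixed:.0%}")
```

Running it, the same-shape neighbor fraction should climb under the "at least a third like me" rule, then fall back once the slight anti-monoculture bias kicks in.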

One of my nerdy interests is science fiction, specifically the cyberpunk genre of science fiction. Cyberpunk is usually about hackers, and "high tech, low life" kind of settings. Think about The Matrix, or Johnny Mnemonic. Actually that was a terrible movie, just think about The Matrix if you need an example of cyberpunk. (pause) Heh. Keanu Reeves has been in a lot of cyberpunk movies. Anyway.

The cyberpunk genre arguably began with William Gibson's novel Neuromancer in 1984, but my favorite cyberpunk novel is

Snow Crash by Neal Stephenson, published in 1992. Specifically I love this one scene that describes a programmer named Juanita working at a software company that's making a virtual reality version of the world wide web, and specifically she's working on making the faces of the avatars be able to convey emotion and expression. The book goes "But at this phase, the all-male society of bit-heads that made up the power structure of Black Sun Systems said that the face problem was trivial and superficial. It was, of course, nothing more than sexism, the especially virulent type espoused by male techies who sincerely believe that they are too smart to be sexists."

Neal Stephenson was commenting on the real-world tech industry 30 years ago. I think about this line, and I think about this tweet from Jessica McKellar, who was the PyCon Diversity Chair for a number of years, and she shows how PyCon went from having 1% of its talks by women to 40% in five years. Think about how little diversity in the tech industry has changed in the 30 years since Snow Crash, and think of how much it can change in 5 years if we actually take diversity seriously.

I'm just impatient. I take a look at the Mercury 7, the seven astronauts selected by NASA in the 1950s to be the first Americans in space. Notice something all seven have in common with each other? I mean, there's been plenty of astronauts who weren't white men. There's... no reason the Mercury 7 had to all be white men. They were all pilots, but there were pilots who weren't white men back even before the 1950s. But I'm sure the American government had reasons for why 7 out of 7 of their astronauts were white and male. It's just that those reasons were wrong. And dumb. So I'm impatient. There's no reason this disparity needed to exist in the 1950s in the space program and no reason it needs to exist today in tech. The excuses we give for it are wrong and dumb.

Opportunity and obstacle are not randomly distributed. There is a pattern to privilege. We're not all playing on a level field. Ask anyone, anyone, to describe life and they will tell you that life is 1) tough and 2) not fair. That's why it falls on us to make our communities more equitable. And I'm impatient. I want to see results. I don't want another 30 years of it being enough to just say, "We're an equal opportunity employer" and have nothing actually change.

Does this mean we have to lower our standards to increase diversity? No. PyCon continues to be awesome, it hasn't gotten worse, it sells out every year. The Python conference you are sitting in right now also sold out of tickets. We haven't lowered standards by making efforts to support and reach out to voices we haven't heard from before. Everyone should learn to code. It's awesome. It's fun as a hobby, you can write little scripts to automate stuff on your computer, and if you stick with it you can even make a career of it. That's something that should be available to everyone.

And we can make it available to everyone.

(breathe) So, that's big picture. Let me leave you with some smaller tips. My first PyCon was 2011 in Atlanta. Lovely town. They have an aquarium with a whale shark. I know, right? And since then I've learned a lot about how to make the most of PyCon.

First, the point of PyCon is to meet people. The talks are great, thank you for coming to mine, but they're also being recorded. Don't feel like you have to cut conversations with people short to make it to every single talk.

When you're chatting with a group of people, you naturally stand in a circle. Instead, stand like Pac-Man, with an open space so new people can join the discussion.

I love going to pycon because I see my friends from across the country, but I have to remind myself that I also should spend time talking to new people. Don't spend the entire conference hanging out exclusively with your friends. Or go ahead and hang out with your friends, but also introduce new people to them.

Don't eat alone. Conferences usually have breakfasts and lunches and dinners, go to those. Follow the pycon hashtag on twitter and ask if people are going out to eat somewhere at the end of the day, or organize your own group dinner plans with random strangers standing next to you. That happens all the time.

Business cards! You remembered to bring business cards, right? That's fine if you didn't, just make one in a text file on your phone's screen and let the other person take a photo of it. But when you do get business cards, order the minimum amount. I know it's cheaper if you buy 50,000 cards, but trust me, just order the minimum amount. Make sure it's white and not glossy so people can write on it. And only have public information about yourself. Somebody should be able to take a photo of your business card and post it on the internet, and you're fine with it. I don't have my home address or even phone number on my card. So I don't care who ends up with it. And when you get a business card from someone, quickly write down the circumstances you met them and what you talked about, so you'll remember it later.

And do follow up. Twitter is great for this. You don't have to post to Twitter, just sign up for an account and you can just follow people and casually reply to random stuff they talk about.

This is a last minute slide addition: wash your hands from time to time throughout the day. No reason I'm bringing this up. This has always been good advice at all conferences. Just... wash your hands.

And finally, there's the shirt color trick. You're in a room full of strangers, and you want to pick someone to talk to. Here's what you do. Think of a color. Find the closest person wearing a shirt of that color. Talk to that person.

Don't think about it, just go by shirt color. The more you think about it, the more your internal biases will choose who you do and do not speak to. Don't think about who you might get along with or who you might have stuff in common with. No, talk to someone truly random. Because when we talk to everyone, we include everyone. I hope you enjoy PyTennessee 2020. Thank you very much for having me.

 

Learn to program with my books for beginners, free under a Creative Commons license:

Take my Automate the Boring Stuff with Python online Udemy course. Use this link to apply a 60% discount.