Who says you can’t change the world? History has repeatedly shown that whatever isn’t possible today can be possible tomorrow.
For one, each day brings opportunity: a chance to learn something new. Not to mention that technology is a space that is forever evolving, so the opportunity to do something earth-shattering might be just around the corner.
If you’re a coder - or coder-in-training - you’re well on your way to putting yourself in a position to seize such an opportunity. And to quote Walt Disney, “It’s fun to do the impossible.”
Here are 10 stories and real-life examples of people who put their coding into action...to quite literally change the world.
1. Conquering Space
Margaret Hamilton was a young, plucky, 33-year-old woman with glasses who liked math. She was one of the few women at MIT’s Charles Stark Draper Laboratory, which in the early 1960s was developing something important for NASA.
The Apollo and Skylab programs would mark a new era in space exploration. Hamilton led the team that developed the in-flight software used to control the Apollo command module, lunar lander, and eventually, the world’s first space station, Skylab.
At a time when computer science courses and the discipline of software engineering didn’t yet exist, Hamilton was self-taught. Developing her skills through hands-on experience, she designed and developed recovery systems and error-detection software. Her amazing efforts left fellow team members dumbfounded.
Without her contributions to the space program, there’s no telling where it might be today. Margaret Hamilton used code to change the world and conquer space.
2. A Window to the World
William Henry Gates III - better known to the world as Bill - was born in October of 1955. By the time he was 13, he was fascinated by computers and took advantage of programs at his school to get access to its computer systems.
He wrote games like tic-tac-toe in BASIC and later, once he had demonstrated his expertise in programming, was enlisted to help a contractor for his school develop payroll programs and class-scheduling software.
In his words, it was “hard to tear myself away from a machine at which I could so unambiguously demonstrate success.”
Eventually, he would take his programming knowledge and use it to capitalize on a new technology that was quickly spreading across the globe—personal computers.
His idea was simple: Don’t make the machines, but make the software that runs them.
He went on to form Microsoft. The company released Microsoft Windows, a graphical interface for MS-DOS, on November 20, 1985.
3. From Dungeons and Demons to Steve
Of the many types of video games created over the years, you’re bound to see a few that are a little off-center.
The 1997 Electronic Arts classic Dungeon Keeper is an odd one. An interesting twist on the classic dungeon crawler, the game puts you in the role of a dungeon master - of sorts - placing obstacles and building dungeons to protect your precious creation from invading heroes.
Then there is Dwarf Fortress, a freeware game with three game modes, all of which take place in worlds created by the player. The game’s Fortress mode is a construction and management sim, while Adventure mode is a turn-based adventure game. The Legends mode showcases player achievements.
Finally, Infiniminer was a 3D isometric building game, which feels a lot like a familiar block-building game that many kids know and love today.
All three of these titles and their unique gameplay inspired a young programmer named Markus “Notch” Persson to create what is perhaps the most popular block-building game of all time—a title called Minecraft. The game has become a cultural and global phenomenon. And from a coding perspective, it has many educational benefits, and has become a catalyst to help introduce and teach kids programming.
“Notch,” didn’t set out to change the world, but he did in an unexpected way.
4. A Wolf in Search Clothing
“Siri, is Puerto Rico part of the United States?”
If you’ve ever asked a personal assistant a question, you may not have known that its search engine isn’t powered by vast databases from Apple or Google. Instead, the search query is driven by Wolfram Alpha, a powerful “computational knowledge engine” that, at its core, is built on Wolfram Mathematica code.
Developed by British-American computer scientist Stephen Wolfram, Wolfram Mathematica is a cross-platform computational powerhouse that can be used to extract data from almost anything. It also powers Wolfram Alpha, which in turn helps you find out what time your favorite movie is playing at your local cinema, through your favorite personal assistant.
Wolfram Mathematica has helped jumpstart the development of machine learning, neural networks, image processors, data science, and more. It’s used prolifically in computing fields that touch scientific research, engineering, and mathematics.
When Stephen Wolfram created the language, he probably didn’t think it would impact the world as it has, but Wolfram and his code have dramatically changed how we live our lives and the world we live in.
5 & 6. Painting with Pixels
It’s rare that a product becomes so much of a “thing” that it becomes a verb. “Google it,” for example - or maybe you’ve had a picture that needed to be “Photoshopped”?
When you reach such a level, you can be sure you have made a significant impact on the world. Thomas and John Knoll (who also developed the story for Rogue One: A Star Wars Story) did just that.
As the creators of Photoshop, the two sought to recreate the same experience they had working in their father’s darkroom. At the time of Photoshop’s launch in 1990, bitmap image editing was already a crowded field. Products like PCPaint, PixelPaint, and PC Paintbrush dominated the space. But what made Photoshop different was its amazing ability to embrace computer-based photography and imagery.
Editing high-resolution images takes a lot of processing power, and Photoshop excelled because of the low-level graphics routines the program used. Thomas Knoll developed the code while getting his PhD in image processing.
With the foundation laid, Photoshop went on to become the go-to software for artists, designers, and photographers. The software evolved as technology did, adopting CMYK support and reaching PCs and the Windows market with version 2.5. In 1994, Photoshop introduced - wait for it - layer-based image manipulation.
The program now has countless freeware clones online, but still remains the standard in image manipulation—and all because a programmer wanted to make his photos more vibrant.
7. The Revolution Known as WordStar
Early computers were extremely unfriendly. Unlike today, they had no mouse or touchpad control. There were no on-screen fonts or proper spacing of characters; in fact, none of the interaction we’re familiar with today existed then. Instead, users relied on ASCII characters to create graphics—data input was clunky, at best.
At the time, IBM ruled business operations—with a typewriter. In fact, it was the office standard and of course, it didn’t interface well, or at all, with computers.
Written by Rob Barnaby, WordStar allowed secretaries to input data directly into the computer using a suitcase-sized, old-school “desktop” machine that featured a tiny green screen and floppy disk drives.
WordStar introduced real document editing to the blossoming 8-bit CP/M computer marketplace, and swept through the business sector like a tornado. It changed how business got done, and how humans would interact with machines. In fact, it changed everything, and its impact is still felt today: it’s because of WordStar that you are reading this or any article on the internet.
8. The Queen of Code
Like most Americans at the time, Grace Hopper wanted to give her all for her country and fight in the big one—World War II. So at 34 years old she tried to enlist in the Navy, but was turned away; she was told that her job as a mathematician and mathematics professor at Vassar College was far more valuable to the war effort.
Undeterred, Grace Hopper did what any young lady with moxie would have done at the time: she got a leave of absence from Vassar and volunteered for the United States Navy Reserve.
After training at the Naval Reserve Midshipmen's School, Hopper graduated first in her class in 1944. She went on to work for the Navy at Harvard University and served on the Mark I computer programming project. She liked the work so much, she remained at the Harvard Computation Lab until 1949, turning down a full professorship at Vassar to work as a research fellow under Navy contract.
Known for her persistence, she spent the 1960s surrounded by computers. One day, Grace recalled, “I decided data processors ought to be able to write their programs in English, and the computers would translate them into machine code. That was the beginning of COBOL, a computer language for data processors. I could say ‘Subtract income tax from pay’ instead of trying to write that in octal code or using all kinds of symbols.”
It was a watershed event: making computer programming accessible with a programming language written in English. It took years for her ideas to be accepted by her peers. But when it was finally adopted, COBOL was the first computer language that used words, not numbers, to run computers. Today COBOL is, as Grace put it, “the major language used today in data processing.” (Learn more about kids coding languages.)
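Hopper’s insight still shapes every modern high-level language: code that reads almost like the English sentence it describes. Here’s a minimal sketch in Python (the function name and dollar figures are illustrative, not taken from any real COBOL program):

```python
# In COBOL, Hopper could write "SUBTRACT INCOME-TAX FROM PAY" instead of
# octal machine code. A Python function can read nearly as plainly:

def subtract_income_tax_from_pay(pay, income_tax):
    """Reads almost like Hopper's English sentence."""
    return pay - income_tax

print(subtract_income_tax_from_pay(5000, 750))  # prints 4250
```

The point isn’t the arithmetic, which is trivial; it’s that a person who has never seen the language can still guess what the line does, which was exactly Hopper’s goal.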
9. The Blip That Birthed an Industry
After the success of his arcade game Computer Space, Nolan Bushnell and his partner Ted Dabney started a company called Atari. Shortly after founding the company, Bushnell hired Allan Alcorn, a former UC Berkeley Electrical Engineering and Computer Sciences student who was then working at Ampex. Because of his experience in electrical engineering and computer science (and because Bushnell and Dabney had worked with him at Ampex), they thought Alcorn the perfect programmer for Bushnell’s next project—a game called PONG.
The first game developed by Atari - a simple two-player electronic version of tennis - PONG became a cultural phenomenon and kicked off the video game industry you know today. (Get your fill of other cool video game facts.)
10. Mashing Faces
In his sophomore year at Harvard, a relatively quiet student worked diligently on a program called “Facemash.” As he nestled into the corner of his dorm room and chipped away at his code, he developed a program that would not only change the world but would change people’s perception of data collection and how people think of personal privacy.
Mark Zuckerberg’s Facemash drew 450 visitors and 22,000 photo views in its first four hours online. Harvard administrators soon shut the site down, claiming it violated copyrights, individual privacy, and security. Ultimately, Harvard dropped the charges, and Zuckerberg was confident he was onto something. He continued developing the social media platform (the “face book” directory was unlike any online directory that came before it). Eventually the program would be known to the world simply as Facebook.
More recently, Facebook has become a tool for political activism—and controversy. But it still boasts millions and millions of users and has set the standard for social media engagement online. Facebook created the roadmap for how social data is captured and shared, and for how it can build a multi-billion-dollar organization with global influence.
Writing Code That Can Change the World
You can write code that can change the world, too. It doesn’t have to be the next great social media platform or a best-selling video game...the point is that the opportunities are endless. Computer code is an amazing tool, and code can entertain, educate, inform, or just create something fun and cool.
This summer, kids and teens can attend iD Tech for a life-changing—potentially even world-changing—experience. (View all coding classes for kids.) And even before they head off to camp, or after they're finished, they can go online for more coding instruction!