1979 Plus 40 Years

by Diana

A friend and I were talking about the year 1979.

My friend observed that we were paid less then than people are paid now, yet we had more money and more ability to participate in activities.  Thinking about this, I decided to consider what it would be like to buy a computer now without bank credit cards, the way we did in 1979, when most people only had department store credit cards with about a $500 limit and would ask to have the limit raised to $1,000.  That is how my dad was able to buy my first home computer: a TI-99/4.

Even at that time, working in the computer revolution meant building a better society in the future; it did not mean becoming an information worker treated like someone in fast food, working long hours for low pay, hoping to qualify for food stamps, and trying to keep their head above water without being evicted or made homeless by an emergency expense.

This future, 40 years on from 1979, is not the future many of us envisioned.  Something is wrong; in my undergraduate years, all of us graduated without student debt and were able to drive, go to the movies more than once a week, and, most importantly, have fun.  Life was not a cycle of study, work, and sleep.

What I and Others Expected

We expected that the future would allow all to enjoy the benefits of being able to create, explore, recreate, and try something new.

We expected the ability to create like we did in 1979, to try half-baked ideas without a permanent social media record, and to fail privately.  Being able to fail privately matters because when people feel they are always under surveillance, it creates groupthink and a herd mentality.  We expected the ability to explore privately, as when one goes outdoors or develops without an Internet link.  Exploring privately means the exploration is not a race; it is a drive and a positive benefit, like exercise.  And we expected to play - an informal term I prefer to the clinical "recreate."

Play is something that was done a lot in 1979.

It meant we would meet and decide what we wanted to do, whether that was showing each other what we had done with our Texas Instruments, Ataris, Commodores, Ohio Scientifics, and others.  It also meant that coding was not our whole life.  We had activities other than coding.  We could afford to have many activities rather than trying very hard to keep even one activity or hobby.

As we witness the development of the low-wage economy, even in the computer field, it becomes apparent that we need to stop this trend.

No one in 1979 would have thought of someone who knew how to program a computer, use what would become the Internet, or develop a web page as a commodity worker - someone who would work the hardest for the lowest pay in the least stable work environment.  What we had in 1979 was the dream of Adam Osborne, David H. Ahl, Steve Jobs, Steve Wozniak... a vision where your idea would lift the quality of life higher for others and for yourself.

At one time, Silicon Valley actually included the people who designed, built, and marketed new ideas and computers.  That is very different from an area where the cost of living is sky-high and the prototyping and building are done by IT production workers in other countries, workers suffering mental health problems while making a computer or tablet that costs $2.50 to produce and sells in the U.S. and other countries for $2,500.

The original spirit of the computer revolution that existed in 1979 still lives on in open source, in other forums, and in universities that let students explore half-baked ideas and fail privately, without a permanent log on social media or in server farms.  It also lives in those who still write about social issues alongside code, as older magazines such as Creative Computing and Byte used to do.

Computer science was very different when I started as an undergraduate.

It meant strictly business computing: no AI, no half-baked ideas, no real-time running of code.  Computer science then, which included parts of computer engineering, meant learning how to make your own CPU from chips.  For us, it meant building a four-bit CPU from SSI and MSI parts like the 7400 and 74138, along with a clock chip.
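For readers who never did that kind of lab work, the sketch below models one of those parts - the 74138, a 3-to-8 line decoder - in software.  The truth-table behavior is that of the standard part; the Python interface and names are just my own illustration, not anything from the original project.

    # A rough software model of a 74138 3-to-8 line decoder, one of the
    # MSI parts mentioned above.  In a four-bit CPU lab it typically
    # turned an opcode or address nibble into one-of-eight select lines.
    def ttl_74138(a, b, c, g1=1, g2a=0, g2b=0):
        """Return the eight active-low outputs Y0..Y7 as a list of 0/1."""
        outputs = [1] * 8                      # outputs rest high (inactive)
        if g1 == 1 and g2a == 0 and g2b == 0:  # chip enabled
            selected = a | (b << 1) | (c << 2) # A is the least significant bit
            outputs[selected] = 0              # the chosen line goes low
        return outputs

    # Select line 5 (C=1, B=0, A=1) with the chip enabled:
    print(ttl_74138(a=1, b=0, c=1))  # -> [1, 1, 1, 1, 1, 0, 1, 1]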

Even at my old university, I don't see much of the balance between actual hardware and software that existed when I started in 1982, majoring in pre-med and applied computer science with a breadth of knowledge in art and history.

In a way, it was a joy to see the old computer display in the building where the computer and engineering sciences were housed together.  Yet it's a bittersweet memory, and I wonder if current students would know what I was talking about if I said one project was to build a four-bit CPU from 74xx- and 74xxx-series chips.  Would a student be able to complete the same project today?  I hope so.

In the world of the information worker, those who lose a coveted position at a major firm become faceless commodity workers - no matter where they work.

However, the perks of having two cars, their own house, and being able to really play on the weekend disappear after they reach age 35 or 40.  Remember the $65,000 of student debt for an undergraduate computer science degree.  So, from age 23 to the mid-30s, a lifetime of expenses must be paid in 12 to 14 years, whereas my generation had at least until age 50; we could move into another field without becoming commodity workers, a field where one is a valued individual, compensated fully, without having to ask the state for food stamps, health insurance, or help with rent.

Whatever happened to the boycott, like the ones around the #MeToo movement and the glass ceiling?  The boycott alone does not work.  One should not have to study harder than a medical doctor only to be treated as a commodity worker.

Even when I started to change careers, my parents said they would support me in obtaining an advanced degree, but not in science or technology, because they felt graduates were mainly treated as commodities: whoever would work for the lowest compensation and the longest hours, and be kicked around like the ball.  This was in 1998, before the extent of corporate welfare was fully known.

So, I went for an MBA as well as a doctoral degree in management and organizational behavior, to start my own company and to continue the ideas of the computer revolution.  One thing I did not do was seek venture capital, because it is a double-edged sword and a delicate dance.  Rather than going public, I set the company up as a private one.  I see an Initial Coin Offering (ICO) as preferable to an IPO, and I hope it will spur a future where all live fully and corporate welfare ends.

With regard to the idea of making a four-bit processor, even the concept of simulating one is not taught in computer science.

The theory is taught, but I prefer code as a way to learn.  An example of the main engine is:

repeat
  data   := getMem(PC);            { fetch the word at the program counter }
  opcode := data and $0F;          { the low four bits hold the opcode }
  case opcode of
    Pushv: doPushv(PC, data);
    Push:  doPush(PC, data);
    Pop:   doPop(PC, data);
    Call:  doCall(PC, data);
    Cmp:   doCompare(PC, data);
    JMP:   doJump(PC, data);
    JNE:   doJumpNE(PC, data);
    JEQ:   doJumpEQ(PC, data);
    JLT:   doJumpLT(PC, data);
    JGT:   doJumpGT(PC, data);
    Skip:  doSkip(PC, data);
    Ret:   doReturn(PC, data);
  end;
until (opcode = haltIns) or pgmStop;

What is shown is a basic CPU simulator that supports the concepts of a program as sequence, control, and interaction.
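For comparison, here is a minimal, runnable sketch of the same fetch-decode-execute idea in Python.  The tiny instruction set, the encoding (low nibble opcode, high nibble operand), and all of the names are my own assumptions for illustration; they are not the original class project's design.

    # Minimal stack-machine sketch of a fetch-decode-execute loop.
    PUSHV, ADD, CMP, JNE, HALT = 0x1, 0x2, 0x3, 0x4, 0xF

    def run(memory):
        pc, stack, equal = 0, [], False
        while True:
            data = memory[pc]             # fetch one byte at the program counter
            opcode = data & 0x0F          # low nibble selects the operation
            operand = (data >> 4) & 0x0F  # high nibble is a 4-bit literal/target
            pc += 1
            if opcode == PUSHV:           # push the 4-bit literal onto the stack
                stack.append(operand)
            elif opcode == ADD:           # pop two values, push their sum (mod 16)
                stack.append((stack.pop() + stack.pop()) & 0x0F)
            elif opcode == CMP:           # compare top of stack with the literal
                equal = (stack[-1] == operand)
            elif opcode == JNE:           # jump to the literal address if not equal
                if not equal:
                    pc = operand
            elif opcode == HALT:
                return stack
            else:
                raise ValueError(f"unknown opcode {opcode:#x} at {pc - 1}")

    # Example program: push 2, push 3, add, compare with 5, halt.
    program = [0x21, 0x31, 0x02, 0x53, 0x0F]
    print(run(program))   # -> [5]

Running it prints [5], and the loop exercises the same three ideas as the pseudocode above: sequence (fetching one instruction after another), control (the compare and jump opcodes), and interaction with memory and the stack.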

We need to think about the future the way we did in 1979, not live in what has become 1879 for many IT workers worldwide.

This is the 21st century.

We have to stop backsliding and once again start pushing the future in a positive way for all, so there is no underclass and no one is treated as a commodity.

All should be treated with full dignity and a full quality of life.

That full quality of life is not a perk.  It is a right.
