Women In Tech

Many people find Apple's, Google's, Facebook's, and other tech companies' statements about wanting to hire more women disturbing, as if it were discrimination against men. But in a growing industry like tech, with enough open spots for the takers, hiring women first would still result in a roughly 70% male workforce. It's not a solution, and it's not what diversity reports are meant to convey.

The main reason more men go into computer science than women is early education and societal gender roles. The gender roles that once kept women from wanting to become doctors are still rampant in tech. It's a boys' club. Thankfully, there are programs right now targeted at teenage girls to get them interested in tech for the next generation of the workforce. Historically, the industry was actually less male-dominated: today, 82% of CS degrees go to men; in 1985, that number was 63%. Computing began as an industry without any association with masculinity, so many of its influential early pioneers, such as Grace Hopper and Ada Lovelace, were women.

Brenda D. Frink of The Clayman Institute for Gender Research writes:

As late as the 1960s many people perceived computer programming as a natural career choice for savvy young women. Even the trend-spotters at Cosmopolitan Magazine urged their fashionable female readership to consider careers in programming. In an article titled “The Computer Girls,” the magazine described the field as offering better job opportunities for women than many other professional careers. As computer scientist Dr. Grace Hopper told a reporter, programming was “just like planning a dinner. You have to plan ahead and schedule everything so that it’s ready when you need it…. Women are ‘naturals’ at computer programming.” James Adams, the director of education for the Association for Computing Machinery, agreed: “I don’t know of any other field, outside of teaching, where there’s as much opportunity for a woman.”

The shift happened once computers became popularized. Once they were a household item, computers were more often given to boys, seen in parents' eyes the same way as telescopes and other stereotypically "boy" items. CS students went from being almost entirely beginners to being assumed by professors to already have experience, which of course gave men an advantage. NPR delves deeper into this history.

Progress in Equality

You might ask why this matters. Assuming someone should or shouldn't like something, or want to do or become something, because of their gender or sex has negative results for everyone. It keeps great minds from their fields because it's against the social norm for them to be there. Gender roles are improving, and that will create a more respectful, less presumptuous society. It means getting rid of the "Don't ask me, I'm just a girl!" attitude that The Simpsons critiqued in the 1994 episode "Lisa vs. Malibu Stacy". The only reason more women aren't in tech and politics is these antiquated cultural norms.

Capitalistic Representation

The vast majority of the richest self-made people are white men, and they hold a lot of power with their money. Take the Koch brothers and their billion dollars in donations toward the GOP for 2016 alone. That money finds its way to restricting abortion access and reducing contraceptive coverage in healthcare. Billionaires, the concept of saviors, and wealth inequality aren't going away anytime soon. The best bet is to make the rich and powerful as diverse as possible, in an attempt to keep at least some representation for the less represented. How can we call our system a democracy when a 31% demographic holds 65% of the power? The billionaires who puppet those elected officials are even less representative, with 85% of them being white men.

Tim Cook, Larry Page, and many other executives are probably not at all sexist, but they don’t have the same personal experiences that would make them use their billions in ways that women would.

“2K”

After the term "4K" became common for UHD TVs, many phone, tablet, and laptop reviews started referring to 1440p, like on the Galaxy Note 4, as "2K". 2K sounds a lot bigger than the 1080p term we've been using for ages, right?

Except not. Full HD (1920×1080) IS "2K", and QHD (2560×1440, or four times the pixels of 1280×720 HD) is NOT; it's closer to 2.5K. But "K" is a rubbish term for consumer electronics in the first place. It's a film industry term, where aspect ratios and resolutions aren't standardized or exact, so it makes sense there. Even if you stretch the 2K label to cover 1440p QHD because 2560 is less than 3,000, it's a statement meant to sound impressive that is factually indistinct from the unimpressive norm.

Format | Resolution | Aspect ratio | Total pixels
Digital Cinema Initiatives (DCI) 2K, native | 2048 × 1080 | 1.90:1 (256:135, ~17:9) | 2,211,840
DCI 2K, CinemaScope cropped | 2048 × 858 | 2.39:1 | 1,755,136
DCI 2K, flat cropped | 1998 × 1080 | 1.85:1 | 2,157,840
1080p HDTV | 1920 × 1080 | 1.78:1 (16:9) | 2,073,600

Professional video cameras record in 5K or 6K, and the editor crops down from there. Consumer electronics are standardized, so it's nonsense to use a term meaning "about 4,000 pixels wide, give or take a couple hundred" for a TV with an exact resolution of 3840×2160, or "about 5,000 pixels wide" for an iMac with a resolution of 5120×2880. Just say 2880p, 2160p/UHD, 1440p/QHD, 1080p/FHD, and 720p/HD, like we always have in the past. No one is ever going to refer to FHD as 2K because the term 1080p is so ingrained in our minds, so we just need to give up 4K. It's confusing to use a second term whose number is nearly double the previously common one only because it refers to a different axis (horizontal pixels instead of vertical lines). It makes people think 4K is four times 1080p in every direction, or sixteen times the pixels, when it's really only four times the pixels.
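If you want to check the pixel math yourself, here's a minimal back-of-the-envelope sketch; the resolution figures are the standard consumer ones mentioned above, and the labels are just for illustration:

```python
# Back-of-the-envelope check: how each common resolution compares to 1080p.
resolutions = {
    "720p HD":   (1280, 720),
    "1080p FHD": (1920, 1080),
    "1440p QHD": (2560, 1440),
    "2160p UHD": (3840, 2160),  # marketed as "4K"
    "5K iMac":   (5120, 2880),
}

fhd_pixels = 1920 * 1080  # 2,073,600

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {w}x{h} = {pixels:,} pixels ({pixels / fhd_pixels:.1f}x 1080p)")

# UHD is 2x 1080p on each axis, so 4x the pixels -- not 16x.
```

Running it shows UHD at 4.0× the pixels of 1080p and QHD at about 1.8×, which is exactly why "4K" overstates the jump and "2K" understates nothing at all.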