Trashy reality TV with a sci-fi twist – what could be better? Turns out, pretty much anything.
I had the opportunity to review the show “Opposite Worlds” on the channel formerly known as “Sci Fi” and now known as SyFy – television for people who can’t spell phonetically.
Let me assure you – this is one of those times when you really only want to read the review. So you should! Check out my review at happynicetimepeople.com
I have previously alluded to my love of all things trashy reality TV.
Well, this week I get to prove it, by reviewing the almost premiere of the amazingly absurd “Kim of Queens”. If you have not yet seen the amazing Kim Gravel attempt to teach girls to “become ladies”, you should check out my review and then schedule your live tweet for next week’s episode.
…might not realize that Janet Evanovich is one of my favorite authors. Evanovich’s books are fantastically funny and well written, with outlandish yet somehow believable plots.
And I got a chance to review her latest book “Takedown Twenty” for my friends over at HappyNiceTimePeople!
This week I had the opportunity to write a review of Malcolm Gladwell’s latest book, “David and Goliath”, for a very funny website: HappyNiceTimePeople.com. It’s a very different style from what I use here, and it’s a little risque!
Read it here, at your own risk!
I had the opportunity to review Dave Eggers’ latest book, “The Circle”, for a website that I love: happynicetimepeople.com. The style is very different from what I use here and it’s a little risque, but for those not faint of heart, I bring you a semi-snarky review of “The Circle”.
“The Circle” was an interesting journey into the near future, and it makes you wonder – should we worry that Google has part ownership in the first quantum computer?
Update! Or that they bought Boston Dynamics, the creators of Big Dog?!
Have you noticed that lately, there has been a debate ripping at the fabric of our society?
No, not politics: operating systems. Like the infamous Cola Wars, sides have been chosen. Debates rage about which one is better. But, far more important than the question of which is better, is the question – what about the children? Seriously, why are our schools taking sides in this corporate and technological debate? A number of schools require students not just to acquire a computer, but require their students to buy specific brands of computer or tablet. The most egregious examples revolve around Apple’s MacBook, but this practice also extends to tablets, both drawing tablets like Wacoal and multipurpose tablets like the iPad.
At first blush this may seem reasonable. Schools make choices about brands all the time, from textbook selections for individual students to school-wide multimedia display systems. But the problem is more fundamental than the question of endorsement. At its root, this is a problem of technological choice: as users and consumers, when we make those choices, we solidify or change the trajectory of technologies.
Technologies are not inevitable: the way we use a technology can have a real impact on how it develops. One example is the bicycle. Originally, bicycling was thought of as a daredevil sport, but as more people wanted to use the bicycle for transportation, innovations transformed it from the traditional penny-farthing design to one with pneumatic tires and wheels of equal size. These changes made the bicycle a viable form of transportation, incorporating a wider user base than the young racers. But this moment of technological flexibility is fleeting. Technologies solidify, and when they do, they lock in the value judgments of the users who were engaged and vocal about their requirements for use.
This is the root problem of the operating system wars, and the role schools are playing when they participate in them. The personal computing industry has been developing for approximately 30 years, but it is difficult to say that the market is mature, given that prior to 1997 less than a quarter of US households had a computer. Since then, that number has climbed to 75% of households. So, more realistically, the personal computing industry has been in a period of significant growth for 16 years. When compared to the automobile industry, which began its first period of significant growth after the release of the Model T, we are comparing the personal computers of our time with the cars of 1929. The question then becomes: would it have been appropriate in 1929 to teach students to drive only one specific model of car? Would that knowledge have held them in good stead throughout the maturation of the automobile? Given that the number of automobile manufacturers in the US dropped from 253 in 1908 to our current 3 major manufacturers, I doubt it.
This problem is not limited to personal computers, because it is really a problem of teaching a specific piece of software, not teaching a global skill. In a graphic design course, is it really appropriate to teach students how to use Photoshop? Or should they be required to demonstrate the skill, regardless of what piece of software they use? In a photography class, we don’t teach students to use a Canon, we teach them photography. In music, we don’t teach students to play a Fender, we teach them to play guitar. This is a significant cultural shift in how we teach students to engage with the world, and yet we appear to be satisfied with the explanation that it is easier to teach everyone using the same technological framework.
In part, I think the reason that we are prepared to accept such a simplistic answer is because of a profound misunderstanding of the maturity of the current industry. While it may appear to be maturing, it is in fact, stagnating. With only two large operating systems available, sold by companies with very different approaches, to be satisfied with either is to accept that this industry has matured and that there isn’t room for alternative approaches to compete. And, the more we lock students into a choice between one or the other, the less likely it is that there will be an alternative that can compete.
This argument is equally applicable to Microsoft’s pre-installation agreements. Unlike Apple, which controls its entire product, Microsoft bundles its operating system into the computer you purchase from a separate entity, the manufacturer. This illustrates the difference between the two companies. Apple is, ostensibly, manufacturing a single product, while Microsoft is a software supplier with a Machiavellian spider web of licensing agreements. This alone indicates a significant problem with the argument that the personal computing field has reached a level of maturity that justifies cementing two major players into the field’s future development. Even comparing the two major players in the field is the equivalent of confusing a car manufacturer with a company that builds only engines. What becomes troubling is that in allowing and perpetuating the idea that there is no third way, we are almost guaranteed to force a nascent industry to conform to a status quo that developed absent of critical assessment, inhibiting legitimate competition. In a country that has 145 manufacturers dedicated to toilet paper and related sanitary paper products, is there not a market for more than two commercial operating systems?
Instead of accepting this status quo, we should be taking a cultural stand against it, encouraging users to experiment with different software suites. Our educators and employers should accept work in different software formats – or expect work to be submitted in a format independent of the hardware and software used in its creation, like the portable document format (pdf files). Instead of being drafted into a battle between giants, we should be staging a revolution. We need to lay down our arms in the Operating System Wars, and join together to demand more from the personal computing industry.
Last week I had the opportunity to speak about “Spectacular Spectacles” at MCAD. I chose to look at software failure as an example of spectacle. Software often fails spectacularly, which corresponds well with the theme, but there is also an interesting argument to be made about how software fails. Software failures are often a result of the most mundane of mistakes.
One such spectacular failure is the explosion of the Ariane 5 Rocket.
The rocket was part of a family of rockets intended to carry a payload into orbit.
The specific payload that the Ariane 5 was supposed to launch was a three-ton satellite.
A $7 billion joint venture of the European Space Agency (ESA) and the Centre national d’études spatiales (CNES), the rocket took ten years to complete. It was intended to give Europe an edge in the competitive, private space industry.
On its maiden voyage, the unmanned rocket exploded about forty seconds after lift-off from French Guiana.
The cause of the failure was a software error in the Inertial Reference System. Worse, the failure stemmed from one of the most frequent mistakes made in coding:
When programming languages store data, they store it as a particular type of data. Integers, alphanumeric strings, and Boolean variables are examples of data types.
Numbers can be stored in a variety of ways, but in the specific case of the Ariane 5 there was an error in the way a particular number was being stored. The inertial guidance system used a 16-bit integer type to store the horizontal velocity. This had worked perfectly well in the previous Ariane 4 rockets. But the Ariane 5 was faster than its predecessors: the horizontal velocity exceeded 32,767, the largest value a 16-bit signed integer can hold.* The value should instead have been stored as a floating point number. But floating point values are less efficient than integers, and since the system was not considered mission critical, the integer representation was overlooked.
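The Ariane software was written in Ada, where the out-of-range conversion raised an unhandled exception; but the underlying limit is easy to sketch in any language. Here is a small, hypothetical Python illustration (using the standard `ctypes` module to mimic a 16-bit signed integer, which silently wraps around rather than raising an error):

```python
import ctypes

def store_as_int16(velocity: float) -> int:
    """Truncate a value and store it in a 16-bit signed integer,
    mimicking the narrow integer type used for the velocity reading."""
    return ctypes.c_int16(int(velocity)).value

# Within range: the stored value matches the real one.
print(store_as_int16(32000.0))   # 32000

# Beyond 32,767: the 16-bit value silently wraps around to a
# meaningless negative number.
print(store_as_int16(33000.0))   # -32536
```

The function name and values here are illustrative, not taken from the actual flight code; the point is simply that 32,767 is a hard ceiling for a 16-bit signed integer, and exceeding it produces garbage rather than a bigger number.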
This caused an error in the Inertial Reference System. The error message was then interpreted by the on-board computer as flight data, at which point it altered the rocket’s course. A part of the rocket (the launcher) began to disintegrate because of the steep angle of flight, and that disintegration triggered the self-destruct mechanism.
This isn’t an esoteric error in computing: typed variables and the limitations of each type are a first-year programming topic at any university. This is not a dig at the programmers, either; the software for this rocket was sophisticated code that had been developed over the course of a decade. Moreover, even if the error had been noticed, it would have seemed trivial: this is the metaphorical equivalent of your house falling down around you because you accidentally slammed a door. But it is remarkable that such a mundane mistake can have such spectacular consequences.
And this leads us to the point of this post: software fails; it fails frequently and for easily understandable reasons. The inability to verify that software works as intended has been one of the foremost research topics in computer science since the mid-1960s. While there has been significant progress, it remains a challenge for the field.
For the public, it should be becoming a greater concern as modern, western society becomes more and more dependent upon computers and the software that drives them.
*More information on data types
Lions, J. L., et al. (1996). ARIANE 5 Flight 501 Failure: Report by the Inquiry Board. Paris.
Gleick, J. (1996, Dec. 1). A Bug and a Crash: Sometimes a Bug Is More Than a Nuisance. New York Times.
Have you been following the new Mars rover, Curiosity, as it explores our nearest planetary neighbor? Well then, you won’t want to miss out on the rover’s first use of its complex, on-board, chemistry lab!
Rover Gets Set to Scoop, Credit: NASA/JPL-Caltech
The lab is central to Curiosity’s mission, because the mobile lab will allow NASA scientists to figure out if this region of Mars could ever have sustained life. There are a lot of different environmental conditions that have to be present for a planet to have life – even tiny, microscopic life. According to NASA, the fundamental requirements for life as we know it are liquid (not frozen) water, organic compounds, and an energy source for metabolism (all the chemical reactions that organisms go through). So, Mars would have needed to be a lot warmer to sustain life. And its current atmosphere is too thin for liquid water to exist on the surface of the planet. But there is evidence that there once was running water on Mars.
And so, Curiosity is going to use its robotic arm to scoop up a little bit of Martian sand and dust and then test it to find out its chemical composition. Curiosity performs these experiments with three instruments – collectively called the Sample Analysis at Mars instrument suite, better known as SAM.
NASA describes the three instruments: a mass spectrometer, a gas chromatograph, and a laser spectrometer. The mass spectrometer separates elements and compounds by their mass, which helps scientists identify them. The gas chromatograph heats soil and rock samples until they vaporize, then separates the resulting gases into their components for analysis. The laser spectrometer measures the different isotopes of carbon, hydrogen, and oxygen in atmospheric gases. Understanding the ratios of these elements is crucial to the mission because they indicate whether Mars could have supported life.
Curiosity is also going to look at the minerals in the soil using an instrument called the CheMin. The CheMin is an X-ray Diffraction (XRD) instrument that also has X-ray Fluorescence (XRF) capabilities. That data will be sent back to Earth and NASA scientists will analyze it. Because minerals indicate the environmental conditions that existed when they formed, scientists will be able to see if water was involved when the minerals were formed, deposited or altered.
Right now, Curiosity is just in the testing phase – NASA scientists wanted to test the rover’s arm at a sand dune that Curiosity is currently exploring, called Rocknest. The entire analysis will take two to three weeks, after which Curiosity begins its 325-foot journey to its next stop: Glenelg. This location is unique because it is where three types of Martian terrain meet.
Eventually, Curiosity is heading to Mount Sharp to test the clay at its base. That clay might hold clues to Mars’ past.
A student in my class posted about the impending crisis (possibly current crisis) that will occur when all the COBOL programmers retire. It was a really smart connection to make and it made me think of one of my favorite stories from the history of software.
The COBOL retirement wave is a real problem – in 2006 Computerworld conducted a survey of 352 IT managers and 62% of the respondents were still actively using COBOL.
We don’t teach new programmers how to use COBOL, and experienced programmers don’t want to maintain other people’s code, because it’s like trying to navigate the labyrinth of another person’s mind – in a different language.
My favorite example of the COBOL dilemma will always be Schwarzenegger in 2008:
The Governator wanted to cut the salaries of state employees because California was in a budget crisis. But reconfiguring the payroll system would take a minimum of six months. Why? Because (in part) the last round of layoffs had cut California’s part-time COBOL programmers!
And that is why you have to love the history of software: Arnold Schwarzenegger, thwarted by the 1970s version of Skynet.