
The Mashup Is Dead

Today I want to rant about a few things I hate. They include:

  • The Word “Mashup”
  • Proclamations of the form: “A Thing is Dead; Long Live that Thing”
  • People Who Insist on Continuing to Use the Word Mashup
  • The Term “Web 2.0”

I know it’s heresy. Mashups and Web 2.0 are what’s hot, right? I myself am considered to be a “mashup creator” working with Web 2.0 concepts.

But that era is behind us. The term “Mashup” made sense when coders were actually lifting data from places it was hard to lift from and putting it into contexts that were hard to access. This, my friends, is no longer the state of affairs on the Internet.

Today, we are working with a world of data that wants to be free and is published via countless well-documented APIs. In the cases where APIs are still not available (or are whorishly published in hopes of becoming universally adopted), advanced tools and protocols are available to automate what used to be hard.

We must remember that the word “mashup” originally hails from music: a talented music editor might string together pieces of previously recorded music to create something new. This was an art form in itself, and it implied a kind of subversion: a repurposing of content, often done without the permission or knowledge of the original creator.

Well, the days of this kind of thing on the Internet are, thanks to everybody’s efforts to open things up, largely over. In a world where open source software is widely accepted, where it makes sense for companies like Facebook, Google, Yahoo, Twitter, Amazon (and gee, every other damn company out there) to publish APIs that encourage their data to be woven into the fabric of the net, there is no need for the coy sense of subversion that comes from the word “Mashup.”

What we’ve got now, folks, is DATA! Great flowing rivers of it! Software that helps us use it! Ruby on Rails, Asterisk, MySQL, PostgreSQL, Apache, FreeSWITCH, Flex! Where it’s not open source, it’s at least free! Everything has an API and the things that don’t are falling away.

The next person that says to me with a straight face that they “make mashups” is going to get sucker-punched. The word has lost its meaning, so let’s move on.

That said, when explaining to a layperson what it is we “creative coders” do, sometimes you have to, well, resort to saying, “I make mashups.” But do us all a favor: try to explain what that really means today. Let’s move to a world where we can think about data, about tools (which is really just code-as-data), and imagine what we can do with it all.

Mashup was a good word for perhaps 2003-2007, but it implies limitations and barriers that simply no longer exist. We can do better.

What would YOU call the innovations that are possible with all the data and tools we have today?

Hacking Freerice.com: A Program to Feed the World

While I was working on some changes to Twittervision yesterday, I saw someone mention freerice.com, a site where you can go quiz yourself on vocabulary words and help feed the world. How? Each word you get right gives 10 grains of rice to, one hopes, someone who needs it.

The idea is that you will sit there for hours and look at the advertising from the do-gooder multinationals who sponsor it. Which I did for a while. I got up to level 44 or so and got to feeling pretty good about Toshiba and Macy’s.

It occurred to me though that my computer could also play this game, and has a much better memory for words than I do. In fact, once it learns something, it always chooses the right answer.

So I wrote a program to play the freerice.com vocabulary game. In parallel. 50 browsers at a time. Sharing what they learn with each other. Cumulatively.

It’s a multithreaded Ruby program using WWW::Mechanize and Hpricot. Nothing terribly fancy, but it does learn from each right and wrong answer, and after just a few minutes seems to hit a stride of about 75-80% accuracy. And a rate of about 200,000 grains of rice per hour (depending on the speed of your connection).
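The core trick is just a shared answer cache behind a mutex. Here’s a minimal sketch of that learning loop in plain Ruby, with the WWW::Mechanize scraping replaced by a simulated quiz so it runs standalone — the `ANSWER_KEY` and `play_round` names are illustrative, not the actual script’s:

```ruby
require 'thread'

# Stand-in for the real site: each word has one correct synonym.
# (In the actual script this knowledge comes from scraping the
# freerice.com response with WWW::Mechanize and Hpricot.)
ANSWER_KEY = {
  'laconic'   => 'terse',
  'garrulous' => 'talkative',
  'pellucid'  => 'clear'
}

known = {}        # word => correct answer, shared across all threads
lock  = Mutex.new # guards the shared cache and the rice counter
rice  = 0

play_round = lambda do |word, choices|
  # Use the cached answer if any thread has seen this word; else guess.
  guess   = lock.synchronize { known[word] } || choices.sample
  correct = (guess == ANSWER_KEY[word])
  lock.synchronize do
    known[word] = ANSWER_KEY[word] # the site reveals the answer either way
    rice += 10 if correct
  end
  correct
end

# Five "browsers" playing in parallel, pooling what they learn.
threads = 5.times.map do
  Thread.new do
    20.times do
      word    = ANSWER_KEY.keys.sample
      choices = (ANSWER_KEY.values + ['wrong']).shuffle
      play_round.call(word, choices)
    end
  end
end
threads.each(&:join)

puts "learned #{known.size} words, donated #{rice} grains"
```

After the first encounter with a word, every later round is a guaranteed hit, which is why accuracy climbs as the cache fills — the 75-80% figure reflects the stream of words the script hasn’t seen yet.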

UPDATE: With some tuning, the script is now able to push out about 600,000 grains of rice per hour, which according to the statistic of 20,000 grains per person per day, is enough to feed over 720 people per day! If one thousand people run this script, it will (allegedly) generate enough to feed 720,000 people per day.
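For anyone checking the arithmetic behind those figures:

```ruby
grains_per_hour           = 600_000
grains_per_person_per_day = 20_000

# One running script, over a full day:
people_fed_per_day = (grains_per_hour * 24) / grains_per_person_per_day
puts people_fed_per_day          # => 720

# A thousand people running it:
puts people_fed_per_day * 1_000  # => 720000
```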

Before you go off on me, disclaimer: Yes, I realize this program subverts the intent of the freerice.com site. I’ve released this not to “game” freerice.com but simply to show a flaw in their design and have a little fun at the same time. If what they are after is human interaction, this design doesn’t mandate it. That’s all I’m saying.

Run it for a while and see how many people you can feed!

Prerequisites:

  • Ruby (Linux, OS X, Other)
  • Rubygems
  • gem install mechanize --include-dependencies

Download the code

18 Months Windows-Free (Nearly)

I’m Dave and I am a former Windows user.

Not that I ever liked it. Back in the day, I used Atari 8-bit and 16-bit 68000 computers. The Atari ST machines were cool because you could run Mac programs on them with the help of the Spectre GCR Mac emulator, and the native Atari programs (like PageStream and Calamus) were actually pretty good themselves. Power without the price. Stickin’ it to the man never felt so good… we had the best of both worlds.

Around 1994, as I was also getting into Linux, I started to use Windows as my primary desktop UI. It sucked, but at least back in those days (Windows for Workgroups 3.11) you knew how it sucked and why. And in general you could work around it. It was lightweight enough to be manageable.

Back when I ran an ISP, I developed a bunch of software using Microsoft SQL Server, ASP, Access, and other relatively common, garden-variety tools of the day. It got the job done and I was happy enough.

During the Mac’s PowerPC years, I always found the Mac to be needlessly obscure and imperious; its choice of the PowerPC architecture, while admirable from a performance standpoint, just made very little sense in terms of interfacing with the rest of the world.

The Web hadn’t really emerged as a viable application development platform at that point, and the Mac was pointlessly obscure in the face of Windows. Everything was available for Windows, and the Mac was precious, delicate, and oh-so-special. I wasn’t interested, despite my respect for the platform.

Around December 2004 I succumbed and bought an iBook G4, a PowerPC machine. As a software developer I was curious about how OS X was coming along so I thought it would be cool to have a current Mac.

When, in early 2006, Steve announced that Apple would be switching to Intel chips, I felt a nearly religious change of heart towards Apple (or perhaps Steve had one towards me). The implications were obvious: the long freeze was over. The Mac would become Intel-friendly, and Intel-friendly OSes like Linux and Windows were suddenly going to be a possibility on the Mac. Yeah, I am aware that there were ways to run Linux and Windows on PPC, but it was hard (and obscure). I’m all about ubiquity and reaching for things that can be done on a huge platform.

I went out and bought a Mac Mini Core Duo shortly after and have never looked back. While I’m writing this on my old decrepit iBook G4, I also own a MacBook Pro, a MacBook, a Mac Pro, a Mac Mini, an iMac, an iPhone, and two iPods. I am a certifiable Apple Fanboy, though I try hard to hide it (and mitigate it).

I still use Windows to run Quickbooks and Quicken, and the occasional odd program (like the Nokia phone firmware updater). It seems it can’t be easily avoided. The Mac versions of both Quickbooks and Quicken are crimes against humanity, though the Windows versions aren’t much better. No matter, home is where the heart is, and I must say that to finally be using a decent OS on decent hardware on a regular basis is truly bliss.

Now I read reports about Microsoft’s dominance in the OS space and I just shrug and think “yeah I guess”, while I myself have been shielded from the tyranny for nearly 2 years… Now when I run Windows, I look at it as some outmoded form of existence that I revisit now and again for nostalgia.

Don’t even talk to me about Vista.

Last February, upon its release, I went out and bought a copy, thinking that as a technologist, I should know what it does and doesn’t do. As an optimist, I figured it had to have some redeeming qualities. After loading it on my PC, I can say it was a pointless exercise bordering on utter disaster.

I wanted to “experience” the Aero Glass features, so I bought a new $175 video card. I bought a new $200 300GB hard drive so I could install Vista without imperiling my old XP installation. This was all a huge mistake. I ended up with my XP installation relegated to a secondary drive, and a bunch of programs that wouldn’t run. Accept or Deny?

Then my Vista boot drive died, and the whole thing ended up as a pile on the floor with a Knoppix Live-CD stuffed in the DVD drive, acting like a life-raft on the Titanic, trying to tar+scp things off onto whatever machine would take it. If I had a gun, I’d shoot the thing. It is dead, and Vista killed it as far as I am concerned.

Now I keep my virtual machines on an external USB drive I can carry between my MacPro or my MacBook Pro (depending on where I am) and I am a lot happier.

Bless you Steve, for finally coming around to Intel. It may not always be better, but at least it’s what everybody else is using. Now when I hear about the latest stupid ideas from Microsoft, I can just shrug them off, secure in the knowledge that a) my Mac will work great, and b) I can run Linux or BSD or Solaris to do anything else.

And I can even know that I can run Windows, if I absolutely must. For creative professionals (and by this I include everybody from artists to coders to database guys), the Mac is truly a gift. Enjoy it, appreciate it. If you are still on Windows, or forcing yourself to use a Linux UI out of ideological pride, it’s time to move.

Anyone with a creative bone in their body should be using today’s Macs.

MoMA NY Selects Twittervision & Flickrvision

Yesterday, I received final confirmation that the Museum of Modern Art in New York has selected my mash-ups twittervision.com and flickrvision.com for its 2008 exhibition Design and the Elastic Mind.

I’m certainly very flattered to be included and have never considered myself to be an artist. I didn’t seek out MoMA on this. I am just very, very happy to have an opportunity to participate in a small way in the ongoing dialog about what technology means for humanity. Crap. Now I sound like an artist.

Incidentally, this means that twittervision.com and flickrvision.com are the first ever Ruby on Rails apps to be included in a major art exhibition. I already told DHH.

Anyway, at RailsConf Europe a few weeks ago, Dave Thomas’ keynote speech emphasized the role of software designers as artists. He said, “treat your projects as though they are artworks, and sign your name to them.” Or pretty close to it. I think this is incredibly valuable advice for software designers today.

We’re past the days of using machines as amplifiers of our physical efforts. It’s not enough to jam more features into code just so we can eliminate one more position on the assembly line. We’re at a point where the machines can help amplify our imaginations.

Today, creativity and imagination (what some folks are calling the right brain) are becoming the key drivers of software and design. With imagination, we can see around the corners of today’s most pressing challenges. While technical skill is certainly valuable, if it’s applied to the wrong problems, it’s wasted effort.

Creativity, imagination, and artistry help us identify the areas where we should put our efforts. They help us see things in new ways.

Everywhere I turn (perhaps partly because I am a Rubyist), I hear discussions of Domain Specific Languages, and of framing our problems in the right grammars.

This is hugely valuable because the creative part of our brain thinks in terms of semantics, grammars, and symbols. If we can’t get the words right, our imaginations can’t engage.

Everything stays stuck in the left side of our brains when we have to jump through hoops to please some particular language or development environment.
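For the non-Rubyists: an internal DSL is just ordinary Ruby arranged so a block reads like a small purpose-built language. A toy sketch — the `Recipe`/`step` vocabulary here is invented for illustration, not from any real library:

```ruby
# instance_eval runs the block with the new object as `self`,
# so bare calls like `step` read as declarative vocabulary
# rather than ordinary method calls on an explicit receiver.
class Recipe
  attr_reader :name, :steps

  def initialize(name, &block)
    @name  = name
    @steps = []
    instance_eval(&block)
  end

  def step(text)
    @steps << text
  end
end

brew = Recipe.new('coffee') do
  step 'grind the beans'
  step 'boil the water'
  step 'pour and wait'
end

puts "#{brew.name}: #{brew.steps.size} steps"  # => coffee: 3 steps
```

Rails migrations and routes files are the same idea at scale: get the vocabulary to match the domain, and the imagination has less friction.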

I hope you all will come out to see Design and the Elastic Mind when it opens at NYC MoMA, Feb 24 – May 12 2008. I’m not sure how we’re going to present the sites but we’re going to see if we can get some partners and sponsors involved to do something really beautiful.

And again, thanks to MoMA for the selection. And here’s to creativity, imagination, and artistry as the next big thing in software design!