I’m looking into new JS frameworks for frontend enhancements, and came across MooTools on a friend’s recommendation. The framework has lots of cool features and seems pretty simple, but the thing I’m most impressed with is the download page. It acts just like the *nix package managers – allowing you to pick and choose which modules you’d like to download – even auto-selecting dependencies. Definitely a novel idea for the web.
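Under the hood, that auto-select behavior is just a walk over a dependency graph: picking a module pulls in everything it transitively depends on. Here’s a minimal sketch in Python – the module names and the dependency map are illustrative placeholders, not MooTools’ actual manifest:

```python
# Hypothetical dependency map, in the spirit of the MooTools download page.
DEPS = {
    "Fx.Slide": ["Fx", "Element"],
    "Fx": ["Core"],
    "Element": ["Core"],
    "Core": [],
}

def select(module, chosen=None):
    """Return the chosen module plus all of its transitive dependencies."""
    if chosen is None:
        chosen = set()
    if module in chosen:
        return chosen
    chosen.add(module)
    for dep in DEPS.get(module, []):
        select(dep, chosen)
    return chosen

# Ticking one effect module auto-selects its whole chain:
# select("Fx.Slide") -> {"Fx.Slide", "Fx", "Element", "Core"}
```

The same traversal also makes it safe to untick nothing by accident: a module can only be deselected if nothing still selected depends on it.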
I just received word that one of my projects won a Davey Award for online marketing and email campaigns:
Munchkin’s Project Pink: Email a Duck, Raise a Buck!
Harley Bergsma at the UXB and I devised this brainchild together. From there he took care of project management and I took care of developing the back-end. The idea is that users can decorate their own pink duck on the site (using Flash), then forward their creation to all their friends. For each person who receives the duck and opens the email (no need to even click a link!), five cents is donated to Susan G. Komen for the Cure. One of the coolest things is that the email dynamically shows how much the duck raised (how many emails were sent) and what the project’s running total was. If you came back to the email a few minutes, days, or weeks later, it would continue to show fully updated information. Rad!
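For the curious: the live-updating trick works because the email embeds an image tag pointing at a server script, so every open fetches a freshly rendered image with the current numbers. A rough sketch of the server-side bookkeeping, in Python rather than the actual back-end code – names like `duck_id` and the in-memory dict are illustrative stand-ins (a real system would use a database):

```python
NICKEL_CENTS = 5  # donation per opened email, in cents
opens = {}        # duck_id -> number of opens (stand-in for a DB table)

def record_open(duck_id):
    """Called whenever the tracking image for a duck is requested."""
    opens[duck_id] = opens.get(duck_id, 0) + 1

def totals_cents(duck_id):
    """The figures the dynamically rendered image shows at request time."""
    duck_raised = opens.get(duck_id, 0) * NICKEL_CENTS
    grand_total = sum(opens.values()) * NICKEL_CENTS
    return duck_raised, grand_total
```

Because the totals are computed at request time, an email opened weeks later renders the then-current numbers, not the numbers from when it was sent.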
In just a couple of months there were over 131,000 forwards, for a total of over $6,550 raised from this part of the campaign alone.
- May 4, 2006
I recently read an article in The Register about Google’s ongoing issues with the massive amounts of spam online. Let’s have an honest moment here: SEO is great, but it’s exploited WAY too much by what I would consider illegitimate websites – you know, the ones whose sole purpose is to display advertisements, portals, etc.
Here’s an idea: let’s apply a Web 2.0 ideology to our searches, in much the same way that sites like Slashdot, Digg, and Newsvine moderate reader comments. One crucial difference: we don’t necessarily elevate page rankings for any given page – that task is to be completed as it’s always been done. Rather, those irrelevant sites should suffer de-listing if enough users give them a thumbs-down based on the nature of their content.

Before you begin explaining the counter-arguments, let me save you some keystrokes. What happens if users begin thumbing-down legitimate sites based on their personal viewpoints and beliefs, on the basis of competition, or for some other that’s-not-the-point-of-this-self-regulating-system reason? Maybe one way around it is to flag the page (or even domain) for review by a select group of people. People could first earn the title of moderator, then even earn rewards for proper moderation, as reviewed by peers (those who want to be moderators?). Either way, it’s a community approach to regulating the quality of search results.
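To make the idea concrete, the key safeguard is that votes never delist anything directly – crossing a threshold only queues the page for human review. A toy sketch in Python, where the threshold and status names are arbitrary placeholders:

```python
FLAG_THRESHOLD = 100  # arbitrary; a real system would tune this

class PageModeration:
    def __init__(self):
        self.thumbs_down = {}  # url -> vote count
        self.status = {}       # url -> "listed" | "flagged" | "delisted"

    def vote_down(self, url):
        """Record a user's thumbs-down; flag (never delist) on threshold."""
        self.thumbs_down[url] = self.thumbs_down.get(url, 0) + 1
        if (self.thumbs_down[url] >= FLAG_THRESHOLD
                and self.status.get(url, "listed") == "listed"):
            self.status[url] = "flagged"  # queue for moderator review

    def moderate(self, url, delist):
        """A vetted moderator resolves a flagged page either way."""
        if self.status.get(url) == "flagged":
            self.status[url] = "delisted" if delist else "listed"
```

The two-step design is what blunts the abuse scenario above: a thumbs-down campaign can only force a review, and the reviewers (the earned-moderator group) make the actual call.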
…just ideas I had while reading the article. I’m tired of all the junk on the net and in my email.
- March 2, 2006
Well folks, Newsvine is finally out of private beta!
It’s a great spin on aggregating news over the net. First, it claims to put wire news up faster than any other site because it skips all the editorial processes that most news websites (think CNN) go through before anything makes a page. Thousands of articles are instantly available from the AP, ESPN, and others.
Second, users have the chance to seed articles. In other words, you can point other readers to news elsewhere on the net that you find particularly interesting or worthwhile.
Third, it allows users to submit their own articles – much like a type of blog. But it’s more than just a blog. Consider it a place to write news articles and editorial pieces for others to read. Gain a following. Earn money through advertising click-throughs.
One thing that really makes this site stand out is the amount of news on it – and it’s well organized. Really well organized. Users also have the opportunity to vote for articles, pushing them up the page. In other words, more popular content gets more exposure.
Check it out! Newsvine
Today really was my day at ApacheCon. Four of the five talks were on things I’m truly interested in – mostly PHP (see previous post). Rasmus gave an interesting talk about using PHP at Yahoo!, with some particulars about building high-performance, scalable systems. The other portion of his talk focused on XML support in PHP 5, as well as SOAP and REST services at Yahoo! (including a pretty cool Yahoo! Maps demonstration). There’s a similar demo on his toys blog: http://toys.lerdorf.com/. There were times, however, when he went a little too deep into the details, though I don’t think it detracted from the quality of the talk.
There was another good talk called “Consuming Web Services using PHP 5” by Adam Trachtenberg (eBay). For the amount of time allotted, I think it was a pretty good discussion of what to expect when working on REST and SOAP clients.
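The REST half of that story is refreshingly simple: build a query URL, fetch it over HTTP, and parse the XML that comes back. A quick sketch of that shape – shown in Python rather than PHP 5 so it’s self-contained, and the endpoint and parameters are made-up placeholders, not a real API:

```python
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

def build_url(base_url, **params):
    """Assemble a REST query URL from a base and keyword parameters."""
    return base_url + "?" + urllib.parse.urlencode(params)

def rest_get(base_url, **params):
    """Issue a GET request and return the parsed XML root element."""
    with urllib.request.urlopen(build_url(base_url, **params)) as resp:
        return ET.fromstring(resp.read())

# Usage (hypothetical endpoint and element names):
# root = rest_get("http://api.example.com/search", query="ducks", results=10)
# for item in root.findall("Result"):
#     print(item.findtext("Title"))
```

SOAP clients add envelope building and WSDL handling on top, which is where most of the “what to expect” in the talk came in.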
Scalable Web Architectures: decent. It’s one of those that really got me thinking about how German and I are going to design the DFL system (fewer hits, but extremely high bandwidth per user).
Now for the fun part of this post: Ruby on Rails (RoR). “Cheap, fast, and good. You can have it all with Ruby on Rails.” It seems like every RoR demonstration I’ve seen fails to really capture much attention from the average web developer, and this one was no exception. When the presenter, who I believe is one of the main developers of RoR, says that a lot of it is “magic” that scares him because he doesn’t really know what’s going on, what are we supposed to think? Yeah – it’s great that they can make these easy-to-install frameworks, but you can’t deny that some amount of programming has to go into developing the framework, and after that, the consumer developers still have to figure it all out (or, in many cases, practice some kind of voodoo automagical programming). Put it this way – it didn’t seem like a lot of those people were very excited after the talk. It appears RoR will remain a novelty for some time to come.