Thursday, July 24, 2008

BaconMarathon was at F8 yesterday...

...and here are the takeaways.

The BaconMarathon opinion of the state of the Facebook Partner Platform:

Year 1 = a ton of abusive applications taking advantage of the fb tools that were provided

Year 2 = will be about creating engaging 'real' applications by playing by the rules to build user trust and focusing on engagement

Year 3 = making money. So much talk about needed payment systems, etc.

The new Facebook user experience is all about the 'news feed'. When we started top3Clicks, the news feed was really the 'third' option, behind emails and notifications, for getting the word out to friends about what is going on.

In the business track, Mark Pincus and his panel had some great advice for young entrepreneurs:

1. If you are building an app on Facebook today, you are way ahead of everyone else out there.

2. F8 feels like the first Internet World back in 1995 (we were there and agree!)

3. If you go to Yahoo's front page, there will probably be an fb app for every link on that page by next year.

4. Make something engaging and use the social graph as a 'tool' to achieve virality. Don't make something viral and then figure out engagement.

5. Good VCs focus on the team first and the idea second.

6. Ship often and learn <-- something we learned many, many years ago.

Wednesday, July 2, 2008

Great EC2/S3 Tool

Tim Kay of ActiveBuddy fame has been working on aws, which he says provides “simple access to Amazon EC2 and S3”. I ran into Tim a while ago and we chatted about aws – around the same time, I also heard about it from Mike.

As all the BaconMarathon apps are running on EC2 and using S3 for storage, I’ve been looking for an easier way to interface with S3 in particular. I finally got around to trying out aws the other day and found it to be more than advertised. Finally, I had one simple command to examine buckets on S3:

aws ls

That was it. Creating buckets, puts, and gets are just as simple:

aws mkdir BUCKET
aws put BUCKET/[OBJECT] [FILE]
aws get BUCKET/[OBJECT] [FILE]

We’re now using aws to push our MySQL backups to S3 on a scheduled basis – a simple change to our existing backup jobs.
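For anyone wanting to do the same, a job like ours can be sketched roughly as the script below. This is a minimal sketch, not our actual backup script: the bucket name (`bm-backups`), the dump directory, and the mysqldump options are all hypothetical placeholders, and it assumes aws credentials are already configured for the user running the job.

```shell
#!/bin/sh
# Minimal sketch of a nightly MySQL-to-S3 backup job.
# Bucket name, paths, and mysqldump options are hypothetical;
# adjust them for your own setup.

STAMP=$(date +%Y%m%d)
DUMPFILE="/var/backups/mysql-$STAMP.sql.gz"

# Dump all databases and compress the result.
mysqldump --all-databases | gzip > "$DUMPFILE"

# Push the compressed dump to S3 with Tim Kay's aws tool.
aws put "bm-backups/mysql-$STAMP.sql.gz" "$DUMPFILE"
```

Dropped into cron (for example, `0 3 * * * /usr/local/bin/mysql-s3-backup.sh`), this gives you a dated dump in S3 every night with no other moving parts.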