Lightweight Models and Cost-Effective Scalability

May 14, 2012 4 comments

For my final blog post, I am going to discuss the last of the Web 2.0 patterns defined by Tim O’Reilly: “Lightweight Models & Cost-Effective Scalability”. O’Reilly describes this pattern as using “agile software-development techniques that are ideally suited to support rapid release cycles, so they have a readiness for change. Integrate lightweight development and deployment processes and complements to perpetual beta. Combine this with low-cost, commodity components to build a scalable, fault-tolerant operational base.”

 

A particular website that I think employs this pattern well is Gumtree. The site was founded in 2000 as a local London classified-ads and community site, designed to connect people who were planning to move to the city, or had just arrived, and needed help getting started with accommodation, employment, or even just meeting new people. Gumtree has since grown to cover 60 cities across six countries: the UK, Ireland, Poland, Australia, New Zealand and South Africa.

 


Gumtree uses the lightweight models and cost-effective scalability pattern well because the website does not supply products for sale itself; instead, it provides the platform that lets customers buy from and sell to one another. With account registration optional, Gumtree can easily support millions of users while making money from those who choose to label a listing as “urgent” for $5.99 per week, promote it as a top ad for $9.99 per week, or highlight it for $2.99 per week. Gumtree never has to take responsibility for storing the products being sold; it can rely on its users’ listings to generate the revenue.
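
To make that zero-inventory revenue model concrete, here is a rough TypeScript sketch. The weekly prices are the ones quoted above, but the number of upgrades sold each week is entirely made up for illustration.

```typescript
// A rough, hypothetical sketch of Gumtree's paid-upgrade revenue model.
// Prices are the weekly rates quoted in the post; the upgrade counts are invented.

const weeklyPrices = {
  urgent: 5.99,
  topAd: 9.99,
  highlighted: 2.99,
} as const;

// Hypothetical number of listings buying each upgrade in a given week.
const upgradesSold: Record<keyof typeof weeklyPrices, number> = {
  urgent: 20_000,
  topAd: 8_000,
  highlighted: 35_000,
};

const weeklyRevenue = (Object.keys(upgradesSold) as (keyof typeof weeklyPrices)[]).reduce(
  (total, upgrade) => total + upgradesSold[upgrade] * weeklyPrices[upgrade],
  0,
);

console.log(`Estimated weekly upgrade revenue: $${weeklyRevenue.toLocaleString()}`);
// No warehouses, no stock: the listings themselves are created and paid for by users.
```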

 

The website also follows several of the best practices discussed in last week’s lecture, particularly scaling with demand: as noted above, Gumtree has expanded into new cities and countries over time as demand for the service has grown. Since eBay owns the Gumtree classifieds website, I believe eBay syndicates business models, building parts of Gumtree on top of components from eBay, which keeps the two services consistent with each other and easy to use.

 

References:

eBay International AG. (2012). Gumtree. Retrieved May 13, 2012, from http://www.gumtree.com.au/

Gumtree. (2012, April 25). Wikipedia. Retrieved May 13, 2012, from http://en.wikipedia.org/wiki/Gumtree

O’Reilly, T., & Musser, J. (n.d.). Web 2.0 Principles and Best Practices. O’Reilly Radar. Retrieved May 13, 2012, from http://oreilly.com/catalog/web2report/chapter/web20_report_excerpt.pdf

Categories: Web 2.0

Leveraging The Long Tail

May 8, 2012 9 comments

Well, it’s time for me to discuss the seventh pattern, leveraging the long tail. Tim O’Reilly describes the long tail as “leveraging customer self-service and algorithmic data management to reach out to the entire web, to the edges and not just the centre, to the long tail and not just the head.” Put simply, leveraging the long tail means taking advantage of the full range of products and services available on the web, the niche items as well as the hits. These two groups make up the two parts of the long tail theory: the small number of popular products forms the head of the demand curve, while the vast number of less popular products forms the long tail.
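
To make the head/tail split concrete, here is a small hypothetical TypeScript sketch. It assumes item popularity follows a Zipf-like curve (demand proportional to 1/rank), which is one common way the long tail is modelled; the catalogue size and the head cut-off are invented numbers.

```typescript
// Hypothetical illustration of the long tail: with Zipf-like demand, the many
// unpopular items collectively matter as much as the few popular ones.

function zipfDemand(numItems: number): number[] {
  // Demand for the item at rank i is proportional to 1 / rank.
  return Array.from({ length: numItems }, (_, i) => 1 / (i + 1));
}

function tailShare(demand: number[], headSize: number): number {
  const total = demand.reduce((sum, d) => sum + d, 0);
  const tail = demand.slice(headSize).reduce((sum, d) => sum + d, 0);
  return tail / total;
}

const demand = zipfDemand(100_000);    // an assumed catalogue of 100,000 items
const share = tailShare(demand, 100);  // everything outside the top 100 "hits"

console.log(`Tail share of total demand: ${(share * 100).toFixed(1)}%`);
// With these assumed numbers, the items outside the top 100 account for
// more than half of all demand, which is the point of serving the tail.
```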

 

The website that I think is a good example of leveraging the long tail is eBay, a very popular online shopping service. eBay has an enormous variety of items for sale across many different categories, including alcohol and food, coins, video games, clothing, movies, electronics, home entertainment, and much more. As well as fixed-price shopping, eBay also hosts many auctions that users can take part in.

 

 

I think eBay uses this strategy well because it builds on the driving forces of the long tail, continually making its tail longer by adding more products for users to buy. The website also uses an architecture of participation to match supply and demand: buyers and sellers can leave each other feedback and a rating for every transaction they complete. Users log into eBay with their own accounts, which gives them greater control and more information, and the site leverages customer self-service to reach the entire web cost-effectively. The final best practice eBay follows is exploiting the low-cost advantages of being online, keeping inventory costs low and playing an aggregation role.

 

References:

eBay. (n.d.). Retrieved May 8, 2012, from http://www.ebay.com.au/

Kaw, P. (2009, May 5). 6 Ways to Leverage the Long Tail in Your Marketing. Hubspot. Retrieved May 8, 2012, from http://blog.hubspot.com/blog/tabid/6307/bid/4723/6-Ways-to-Leverage-the-Long-Tail-in-Your-Marketing.aspx

Long Tail. (2012, May 3). Wikipedia. Retrieved May 8, 2012, from http://en.wikipedia.org/wiki/Long_Tail#Internet_companies

O’Reilly, T. (2005, September 30). What is Web 2.0. O’Reilly. Retrieved May 8, 2012, from http://oreilly.com/pub/a/web2/archive/what-is-web-20.html?page=1

Categories: Web 2.0

Perpetual Beta

April 28, 2012 5 comments

This week I am going to be writing about Tim O’Reilly’s sixth pattern, perpetual beta. Perpetual beta describes software that is kept at the beta stage of development for an indefinite period of time. When a program or system is in perpetual beta, the developers should release early and release often; for internet-based software in particular, this is a critical success factor.

 

 

The example I’m going to discuss is Battlelog, the web-based service through which the well-known game Battlefield 3 is played. The Battlelog website lets you launch the game in campaign, co-op, or multiplayer mode, and the friends’ battle feed shows what other players have been up to. Battlelog can be considered perpetual beta because the developers update the website roughly every one to two months, adding and refining features such as the server browser filters and the “create a match” feature, which is still in the works. Two recently added features are the ability to reconnect to the server you were playing on if an error occurs, and the ability to follow the progress of a round live, such as the number of tickets remaining and the scoreboard updates.

 

 

I think this web-based software is a good example of perpetual beta because it follows most of the best practices. Updates are released early and often, as mentioned above, and users act not so much as co-developers but as real-time testers: the forums on the website let them suggest improvements and report any bugs they find. The developers of Battlelog also clearly instrument their product. Within the limits of privacy law, they can observe what users are clicking on and doing on the website, which lets them spot problems and identify hot spots on a page based on where users move their mouse; this kind of tracking can be done with a small piece of JavaScript (or, historically, Flash) that records users’ mouse positions, as sketched below. So as long as there are improvements to be made to the Battlelog website, it will stay in perpetual beta.
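
Here is a minimal, hypothetical TypeScript sketch of that kind of client-side instrumentation: it samples mouse positions in the browser and ships them to the server in batches. The collection endpoint and batch size are made up, and this is not Battlelog’s actual code.

```typescript
// Hypothetical mouse-position instrumentation for heat-map style analytics.
// ENDPOINT and BATCH_SIZE are invented for this example.

const ENDPOINT = "/analytics/mouse-heatmap";
const BATCH_SIZE = 50;

interface MouseSample {
  x: number; // page X coordinate in pixels
  y: number; // page Y coordinate in pixels
  t: number; // timestamp in milliseconds
}

const samples: MouseSample[] = [];

document.addEventListener("mousemove", (event: MouseEvent) => {
  samples.push({ x: event.pageX, y: event.pageY, t: Date.now() });

  // Once enough samples have accumulated, send the batch without blocking the page.
  if (samples.length >= BATCH_SIZE) {
    const batch = samples.splice(0, samples.length); // empties the buffer
    navigator.sendBeacon(ENDPOINT, JSON.stringify(batch));
  }
});
```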

 

References:

Perpetual Beta. (n.d.). Wikipedia. Retrieved April 28, 2012, from http://en.wikipedia.org/wiki/Perpetual_beta

O’Reilly, T. (2005, September 30). What is Web 2.0. O’Reilly. Retrieved April 28, 2012, from http://oreilly.com/pub/a/web2/archive/what-is-web-20.html?page=4

EA DIGITAL ILLUSIONS CE AB (2011). Battlelog. Retrieved April 28, 2012, from http://battlelog.battlefield.com/bf3/

Categories: Web 2.0

Software Above the Level of a Single Device

April 24, 2012 6 comments

Many people these days rely on applications developed for mobile phones and smartphones. The last thing anyone wants to do while on the move is pull out a laptop, sit down and search for what they are looking for on a computer; why do that when you can do it all on your phone? This is why mobile phones are steadily becoming a bigger part of everyday life. This week I am going to write about pattern five, software above the level of a single device.

 

To explain this pattern further, I’m going to use the mobile application Google Places as an example. Google Places helps the user find places nearby. Once the user has selected a place (for example a particular ATM), either by browsing a category or by searching for its name, they are shown a screen with the place’s details and ratings, along with a button that displays its location on a map, since the application is built on top of Google Maps. Two further buttons let the user call the place or get directions. I think the most important feature of Google Places is that users can write reviews of the places they have visited, so that when other users search for a place they can read the reviews already left, which helps them decide whether the place sounds good or bad. Below are some pictures that show the features I outlined in this paragraph.

I think Google Places is a great mobile application because it lets users look up local places easily while they are out and about, without needing access to a computer. I believe the app will keep improving over time with better and more useful features, which could attract more users and make it even more popular. Google Places is also very useful for people who own a business, as it lets them create a listing that helps their business stand out from competitors, so that when users open the app on their phone, that business can be discovered.
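
As a rough illustration of what “finding places nearby” involves under the hood, here is a hypothetical TypeScript sketch that ranks places by great-circle (haversine) distance from the user. This is not Google’s actual algorithm or API; the place names and coordinates are invented.

```typescript
// Rank a list of places by straight-line distance from the user's location.

interface Place {
  name: string;
  lat: number;
  lon: number;
}

// Great-circle distance in kilometres between two coordinates (haversine formula).
function haversineKm(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const toRad = (deg: number) => (deg * Math.PI) / 180;
  const R = 6371; // mean Earth radius in km
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

function nearbyPlaces(user: { lat: number; lon: number }, places: Place[], limit = 3): Place[] {
  return [...places]
    .sort(
      (a, b) =>
        haversineKm(user.lat, user.lon, a.lat, a.lon) -
        haversineKm(user.lat, user.lon, b.lat, b.lon),
    )
    .slice(0, limit);
}

// Example: a few made-up ATMs around Brisbane's CBD.
const atms: Place[] = [
  { name: "ATM - Queen St", lat: -27.4698, lon: 153.0251 },
  { name: "ATM - South Bank", lat: -27.4748, lon: 153.0235 },
  { name: "ATM - Fortitude Valley", lat: -27.457, lon: 153.034 },
];

console.log(nearbyPlaces({ lat: -27.4705, lon: 153.026 }, atms, 2));
```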

 

References:

O’Reilly, T. (2007, November 28). Software Above the Level of a Single Device. O’Reilly Radar. Retrieved April 24, 2012, from http://radar.oreilly.com/archives/2007/11/software-above-the-level-of-a.html

Google Places. (2011). Google Mobile. Retrieved April 24, 2012, from http://www.google.com/mobile/places/

Hanke, J. (2010, April 20). Introducing Google Places. Google Official Blog. Retrieved April 24, 2012, from http://googleblog.blogspot.com.au/2010/04/introducing-google-places.html#!/2010/04/introducing-google-places.html

Pictures taken from personal smartphone

 

Categories: Web 2.0

Rich User Experience

April 3, 2012 3 comments

This week’s post is about Rich User Experience, which is the fourth web 2.0 pattern that Tim O’Reilly defines.

 

Most people these days use desktop applications such as the Microsoft Office suite to give themselves a rich user experience without using the internet. However, desktop applications are not the only form of rich user experience: there are Web 2.0 applications in existence today that provide a rich user experience online.

 

I’m going to use Google Docs as an example of a Web 2.0 tool that provides a rich user experience and explain how it does so. The application is essentially an online alternative to the Microsoft Office suite. Google Docs stores the files you are working on online, so you do not have to rely on USB sticks or portable hard drives to get at them; all you need is a computer and an internet connection. Another advantage is that users can share a document with other users simply by sending a link to the file, which the other user can then open in their web browser to view the document. The third advantage, and in my opinion the most outstanding feature, is that users can collaborate on a document simultaneously.

 

I think Google Docs is a good example of a Web 2.0 tool that provides a rich user experience because it combines the best of the desktop and online experiences in one application. It delivers the benefits of a rich user experience, engaging users and keeping them satisfied while clearly differentiating itself from the Microsoft Office suite. Google Docs also puts usability and simplicity first, following the best practices described for tools that provide a rich user experience.

 

References:

http://www.thethinkingstick.com/10-reasons-to-trash-word-for-google-docs/

http://www.tstiles.com/dms/web20/richuser.html

http://en.wikipedia.org/wiki/Rich_Internet_application

https://docs.google.com/?pli=1#home

Categories: Web 2.0

Innovation in Assembly

March 27, 2012 6 comments

The next Web 2.0 pattern to be discussed this week is Innovation in Assembly. Tim O’Reilly explains that the Web 2.0 mindset is good at re-use: when a web service with highly useful information is found on the internet, it can be combined with another such web service to create a new and improved service or web application. This is called a “mashup”, and mashups are the main way innovation in assembly works. An example of a mashup is TwitterSpy, a website that shows the public Twitter feed in real time and highlights on a map where each tweet is posted from as it is being posted; it is a mashup of Twitter and Google Maps.

 

However, I would like to discuss a stronger example of a web mashup called PopUrls. PopUrls is one of the most popular news aggregators: a mashup of the web’s most visited social news sites that captures the latest headlines in real time. It has columns showing content from websites such as Digg, Lifehacker, Wired, Reddit and Google News, and there is also a section displaying the latest and most popular pictures and videos from Flickr, YouTube and Hulu, among others.
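
As a rough illustration of how a PopUrls-style aggregator can be assembled, here is a hypothetical TypeScript sketch that pulls several JSON feeds, merges them, and lists the newest headlines first. The feed URLs and JSON shape are invented for the example; the real sites expose their own RSS or API formats.

```typescript
// Hypothetical headline aggregator: fetch several source feeds and merge them.

interface Headline {
  source: string;
  title: string;
  publishedAt: string; // ISO 8601 timestamp
}

// Made-up JSON feeds, one per source site.
const FEEDS: Record<string, string> = {
  Digg: "https://example.com/feeds/digg.json",
  Reddit: "https://example.com/feeds/reddit.json",
  Wired: "https://example.com/feeds/wired.json",
};

async function fetchFeed(source: string, url: string): Promise<Headline[]> {
  const response = await fetch(url);
  const items: { title: string; publishedAt: string }[] = await response.json();
  return items.map((item) => ({ source, ...item }));
}

async function latestHeadlines(limit = 20): Promise<Headline[]> {
  const perFeed = await Promise.all(
    Object.entries(FEEDS).map(([source, url]) => fetchFeed(source, url)),
  );
  // Flatten all sources into a single list and sort newest-first.
  return perFeed
    .flat()
    .sort((a, b) => Date.parse(b.publishedAt) - Date.parse(a.publishedAt))
    .slice(0, limit);
}

latestHeadlines().then((headlines) =>
  headlines.forEach((h) => console.log(`[${h.source}] ${h.title}`)),
);
```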

 

I think PopUrls is a good example of innovation in assembly because it gives users so many things to explore, all in one place. Because the site is a mashup of so many different websites, it captures the whole idea behind innovation in assembly: putting two or more sources together in a way that creates a new web service or application. PopUrls is designed for remixability, which allows digital content to be taken apart and recombined, and this keeps the site popular because the content is forever changing as new updates appear, giving users a reason to keep coming back. In my opinion, PopUrls has pulled off innovation in assembly by taking many different free sources from around the internet and merging them all into one place.

References:

http://en.wikipedia.org/wiki/PopUrls

http://twitspy.com/

http://popurls.com/

http://www.morphmandude.com/web2nu/?q=node/4

Categories: Web 2.0

Data Is The Next “Intel Inside”

March 19, 2012 6 comments

This week’s blog post discusses the role data plays in our lives today, which is why I’m going to be talking about the second Web 2.0 pattern, data is the next “Intel Inside”. Data plays a very important role in our everyday lives, and countless companies keep their data stored across large numbers of servers in order to preserve and protect it.

 

I’m going to demonstrate this Web 2.0 pattern by discussing, as an example, the famous social networking website Facebook. The site is made up almost entirely of user-generated data, which is unique and hard to recreate. As with Flickr, the core data is enhanced through profiles, groups, notes, and tags. Each user on Facebook has a profile made up of their personal information, a friends list, photos, and a wall, where their friends can post comments and links to videos or photos.

Facebook also provides a set of privacy settings that lets each user choose their desired privacy levels, including who can comment on their photos and profile. Users can also control which portions of their Facebook profile are visible to their friends and to the public.

 

It turns out that Facebook stores roughly 800 pages of data in total about each user who has an account. In October 2009 the company was estimated to have 60,000 servers holding all of this user data, and it is highly likely that the number has grown since then. In 2011, Facebook spent $606 million on data centre infrastructure, covering construction, servers, networking equipment and storage.
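
As a quick back-of-envelope illustration of that scale, here is a small TypeScript sketch using only the figures quoted in this post: roughly 800 pages of stored data per user, the more than 800 million users mentioned below, and the 2009 estimate of 60,000 servers.

```typescript
// Back-of-envelope arithmetic from the figures quoted in the post.

const pagesPerUser = 800;     // pages of stored data per account
const users = 800_000_000;    // "more than 800 million users"
const servers = 60_000;       // the October 2009 estimate; the real figure is likely higher

const totalPages = pagesPerUser * users;
const pagesPerServer = totalPages / servers;

console.log(`Total pages of user data: ${totalPages.toExponential(2)}`);          // ~6.40e+11
console.log(`Pages per server: ${Math.round(pagesPerServer).toLocaleString()}`);  // ~10,666,667
```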

 

Below is a picture of Facebook’s largest data centre, located in Prineville, Oregon, where tens of thousands of servers handle the data generated by more than 800 million users, and this is just one of the company’s data centres.

 

 

To sum up this post, I think Facebook is a good example of the Web 2.0 pattern discussed here because it meets just about all of the best practices we went over in last week’s lecture: the user profiles hold personal information, photos, and the friends list, and the user’s wall is where most of the user-generated data is created.

 

References:

http://radar.oreilly.com/archives/2007/12/google-admits-data-is-the-inte.html

http://newsroom.intel.com/community/news/blog/2012/01/19/a-peek-inside-facebooks-oregon-data-center

http://www.datacenterknowledge.com/archives/2009/05/14/whos-got-the-most-web-servers/

http://www.datacenterknowledge.com/archives/2009/10/13/facebook-now-has-30000-servers/

http://www.facebook.com

Categories: Web 2.0