Posts Tagged ‘Google’


As a dedicated supporter of Android-based portable devices, this is a difficult post for me to type.

I made my first, reluctant foray into the world of the tablet user at the end of 2011, when HP decided to mark down their TouchPad to a spectacularly low $99. While it ran the slightly left-of-field webOS, the device could be flashed to run a customised Android ROM. As the devices sold out quickly across the world I had to resort to locating one in the US and having it shipped over.

Anyway, I liked the tablet format very much but wanted something slightly smaller than the A4-sized TouchPad and larger than the Android phone I was using at the time, an HTC Hero. Enter the Google Nexus 7. I bought one of these 7-inch tablets in September 2012 and loved it straight away. It ran Android 4.1, integrated completely with all the Google services I used and, most importantly, it fitted into the inside pocket of my jacket! No more clunky devices the size and weight of a cut-down laptop to lug around. I brought it everywhere and used it for everything from work to reading books (yes, it completely converted me to e-books). When my tablet was stolen in New York during my work as part of the Hurricane Sandy response I went out straight away and bought a new one. Actually, as the Staples outlet I went to had a special offer, I bought two! One for me and one as a Christmas present for my wife.

The device was used daily and about 6 months after purchase it started to develop a niggling issue with charging. The micro-USB cable had to be fiddled with a good bit for the device to actually charge, and the slightest movement of the cable would stop the charging process. I put it down to a faulty/worn cable plug and bought a replacement cable. Actually, I went through half a dozen cables in about as many weeks. Still the problem persisted. Not only that, but my wife’s tablet was starting to have the same issues. Logic made me conclude that the connector/socket on the tablet might have become worn or damaged. We limped on, applying tricks such as using a heavy book to keep the connector plugged into the tablet, using something to push the connector upwards, etc. These were only temporary solutions as the problem steadily worsened. In early November last year my tablet would just not charge anymore, at all. It was, to all intents and purposes, dead. My wife’s tablet was slowly limping in the same direction.

By this point I finally resorted to Googling the issue and, lo and behold, I was not alone! There were hundreds if not thousands of messages on forums and newsgroups from people having the same problem (Googling “nexus 7 charging problem” returns 1,600,000 results!), all with Nexus devices that eventually would no longer charge using a cable. Two solutions were mentioned: using wireless charging or using the Asus Nexus Dock (Asus is the manufacturer of the Nexus 7).

[Image: Nexus 7 charging problem]

Armed with this knowledge I contacted Google to see what they were doing to deal with this problem. It was clearly a manufacturing or design issue, so logic dictated that they should rectify it. The disappointing reply I received was that I should contact the vendor where I had purchased the devices. As this was a Staples outlet in New York City and I live on the other side of the Atlantic, I contacted Staples via their Twitter account. All they could tell me was to contact the store itself, which could only be done by phone (why email wasn’t possible is beyond me). The store told me that I should return the tablet to them so they could establish what the issue was. However, I was notified at that stage that the 1-year warranty had expired and that any repairs would be chargeable. To me. So I would have to ship the device to New York, pay for having a design fault repaired and then pay to have it shipped back to me? No thanks.

So, I went ahead and ordered the Asus dock. From the US as it wasn’t for sale in Europe yet. The dock arrived several weeks later and I could finally charge my tablet and, you know, actually use it.

However, shortly afterwards I noticed that the tablet would charge really slowly when in the dock. I would have it docked overnight and the charge level would only increase by 10% or so. But then sometimes it would charge fully. On top of that it would discharge really rapidly, especially when the charge went below 35%. It would sometimes go from 35% to empty in minutes.

I resorted to Googling the problem again (how ironic) only to discover that this too was a common problem, apparently affecting devices that were upgraded to the latest version of Android (KitKat 4.4). This happened early in 2014, which coincided with the time our tablets were starting to have these problems. So not only was the charging port faulty due to a design flaw, now the OS was causing charging problems as well!

By this point we had 4 Nexus 7 devices in the family as my 2 youngest kids had used their Christmas money to each buy a Nexus 7. I had let them as I had assumed that any problems would have been resolved by now and that newer devices would not have any more charging problems. That was before the upgrade to KitKat 4.4!

I again reached out to Google at this point. Again with a disappointing response. This time they didn’t refer me to the place where the devices were purchased but told me to contact Asus (the device manufacturer) instead. This in spite of the fact that these are Google-branded devices. Asus of course flat-out refused to do anything as the devices were out of warranty and the issue was OS-related, which was not their responsibility.

At this point I had had it, and in spite of championing Google Nexus devices in the past I am now at a stage where I will probably never purchase any of their devices again. Not because the purchased devices were faulty but because of the unacceptable way they shirk any responsibility and just pass the buck to a third party. In fact I am eyeing up the Apple iPad mini Retina at the moment. Those who know my aversion to iOS will realise what a big shift this is.

I was at a gig this evening when an interesting tweet showed up in my timeline. It said “Major internet outages across America” and linked to a website displaying stats on global internet traffic. The stats for North America showed something odd. Several nodes were not responding, response times were up significantly, packet loss was up and consequently traffic was dipping big time. I thought at first that it might be due to Hurricane Sandy but when I looked at the locations they were spread all over the USA. Weird…


I tweeted a few people in the US to see if they had any idea but nothing concrete came back. Some suggestions were made that the traffic drop might be due to lunchtime, but that didn’t make sense to me, especially as lunchtimes across the US are obviously spread out and also because lunchtime would not cause excessive packet loss. So I decided to have a look at the stats for Europe. To my surprise the same thing was going on there.


Next were the global stats, which showed the same story.


So there had been a substantial increase in response times and packet loss, resulting in a big dip in internet traffic, across the globe AT THE SAME TIME…

That was a serious Whiskey Tango Foxtrot situation. So I had a look at the global stats for the last 30 days. These stats indeed confirmed what I described above, but they also showed a few other interesting facts:

  • An even bigger blip had occurred around October 8-9.
  • There has been a steady increase in response times & packet loss since October 8th.


Now, there might be a perfectly harmless explanation for this, but we are looking at a globe-spanning network with components managed and owned by a mix of private and public sector organisations. To see a global trend in performance across the whole network for a prolonged time is something that raises questions. So I decided to run a quick search for other reports on this and stumbled across an article by ABC News. The article basically reported in a bit more detail what I outlined above. It reported that YouTube, Amazon, Google’s App Engine, Tumblr and other sites had been affected. However, most interesting was that no explanation was offered for what might have caused this degradation.

I will keep digging..


It’s been only 48 hours since I wrote my blogpost on how to track a “troll” online. The blogpost itself was inspired by Leo Traynor’s story of how online trolling and harassment crossed over into real life and how he managed to find his tormentor. Since then I’ve had several thousand hits on that particular blogpost and have received phone calls and emails from different media outlets with questions on this topic. It’s obviously a hot issue…

My blogpost was not meant to serve as a manual on how to track someone online; it was more an insight that, yes indeed, you can legally track someone online and find out their identity and/or location. It was, however, also meant to serve as a warning of sorts about how much private information people put online using various social networks. This second issue needs elaborating on, in my opinion, as it’s an often ignored issue or at least one that elicits a lot of ignorant commenting.

First rule of online privacy: DON’T PUT ANYTHING ONLINE THAT YOU WOULDN’T SAY TO A COMPLETE STRANGER!

The above is the simplest but most effective rule: don’t make any comments about someone online that you wouldn’t say to their face and don’t put any images online which you want to keep private. Adhering to that rule will save you a lot of trouble. Also remember that anything online, once it is indexed by Google, will stay accessible online forever. That’s right, Google caches every website that it indexes. That means that there will be a publicly accessible copy of that content on a Google server. Google will in certain instances remove content from its servers, but rarely because the content is offensive or untrue, and this is even less likely if you are not the owner of the website. So getting content which you put on LinkedIn, Twitter or Facebook and have since deleted also removed from Google’s cache is as good as impossible. The point is to *not* put said content online in the first place.

Second rule of online privacy: USE YOUR PRIVACY SETTINGS!

Most social networks have privacy settings. USE THEM. Even Twitter lets you protect your tweets by setting your account as private, or you can simply block people. Note: not a lot of people realize that if they block someone on Twitter, the blocked person can still read their tweets by running a search for them. The only way to really prevent someone from seeing your tweets is to protect them.

On Facebook you have a lot more flexibility in regards to your privacy settings. You can have one setting for who can see your details, another for who can see the images you upload and so on. It gives you multiple levels of control. USE THEM!  There is no reason why something that you put on Facebook should be seen by someone who you do not want to see it.

Third rule of online privacy: WHAT HAPPENS ON THE INTERNET STAYS ON THE INTERNET!

Yes, that’s right; anything that is put up on the internet (websites, blogs, social media and *everything else*) stays on the internet. Forever. The reason for this is Google. In order to be able to serve you with these fantastic search results, Google uses software (so-called spiders) to index everything on the internet. Once they have indexed the content of a page, Google stores a copy on their own servers. This process is called caching. So if you have put something online, once it’s indexed by Google (and this is done very quickly) it is there for all eternity. You can remove the content, delete the page and even format the server that it was on, but it will still show up in Google’s search results and these search results will link to a copy of the content in Google’s “cache”. Of course you can attempt to get Google to remove the content from its cache, but this will eventually result in the need for legal action with a limited success rate. Not a lot of people have the energy, or more likely the funds, to go down this route.
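
To illustrate where that control actually sits: whether a spider indexes a page at all is governed by the website owner’s robots.txt file, something an individual user of LinkedIn, Twitter or Facebook never gets to edit. The minimal Python sketch below (using only the standard library and a placeholder example.com domain) shows how a crawler checks that file before fetching a page.

```python
# Minimal sketch: how a crawler such as Google's spider decides whether it may
# fetch a page, based on the site's robots.txt. The domain below is a
# placeholder used purely for illustration.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # fetch and parse the site owner's robots.txt

# "Googlebot" is the user-agent string Google's spider identifies itself with
allowed = rp.can_fetch("Googlebot", "https://example.com/some-old-post")
print("Allowed to crawl:", allowed)
```

The point of the sketch is simply that the decision lives with the site owner, not with the person who posted the content.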

So, should you put nothing at all online? While this is obviously the most foolproof route to protect yourself from embarrassment, it is not necessary. You can still be a prolific social media user without exposing everything about yourself. Take my own case: I blog, have 200,000 tweets to my name, check in on Foursquare regularly and much, much more. However, not *everything* I do finds its way online. If I go somewhere or do something that is private I just refrain from tweeting about it and certainly don’t check in while doing so. Being such a prolific social media whore while leaving private matters out also means that one can’t see the forest for the trees.


Tax is evil; there, I said it. The organised extortion of money by the state, sanctioned by a threat of violence or incarceration, is something I object to on principle. However, I am realistic enough to understand that “we the citizens” need to make a small contribution towards the running of the state apparatus. A flat tax would be the most equitable way to do so and would allow for the abolition of all tax loopholes as well as so-called stealth taxes, sales tax, duties, excise and more.

However I am digressing from the topic of this blogpost….

Irish corporate tax, and specifically the low rate, is a hot topic of conversation both in Ireland and across the EU. The debate mostly centers around whether Ireland should be allowed to hold on to this low rate which, on the face of it, has attracted giants such as Google, Facebook, LinkedIn, Dell, Microsoft, Apple and a whole raft of other big players to the country. Other EU countries understandably seem to think that this low corporate tax rate (12.5% compared to, for instance, 28% in the UK) gives Ireland an unfair advantage. It turns out that they might be wrong…

During an exchange on Twitter with the fabulous @dhkirk yesterday it transpired that, even though the corporate tax rate is substantially lower than in most other EU countries, most of these multinationals only pay that tax rate on a small percentage of their revenue. See, @dhkirk was researching this to ascertain the validity of InvestNI’s statement that a lowering of the corporation tax in Northern Ireland would result in it being just as attractive to large corporates as the Irish Republic. You can read his blogpost here.

The common perception is that the large corporates sluice all their European revenues into their Irish corporate entity through the use of licensing agreements, allowing them to pay Ireland’s 12.5% corporate tax rate not just on the Irish revenues but on almost *all* revenues across Europe. It now turns out that this is only part of the chain. Apparently, because of a quirk in Irish law, if the Irish subsidiary is controlled by managers elsewhere, like the Caribbean, then the profits can skip across the world tax-free. This (legal) construction is known as a “Double Irish Sandwich”. Let’s try an example: ACME Inc. has offices all over the world. It registers a corporate entity in Ireland; let’s say it’s called ACME Eire Ltd. Management of all patents and intellectual property regarding ACME Inc.’s products is transferred to ACME Eire Ltd. At the same time ACME Inc. sets up a corporate entity in a tax haven (such as Bermuda, the Cayman Islands, etc.). It then assigns the *ownership* of all patents and intellectual property regarding ACME Inc.’s products to the corporate entity based in this tax haven. This construction then results in all of ACME Inc.’s companies globally being billed for use of these patents and intellectual property by ACME Eire Ltd. These fees paid to ACME Eire Ltd. can be as high as most of their revenue. ACME Eire Ltd. in turn pays an “administrative fee” to the entity registered in the tax haven. The result is that ACME Eire Ltd. only pays the low Irish corporate tax rate on a fraction of its revenue. In the case of Google, this construction reduced the company’s taxable gross profit in Ireland from €5.5bn to just €45m.
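
To put some purely illustrative numbers on the ACME example above, the simplified calculation below shows how a large licence fee paid to the tax-haven entity shrinks the profit that Ireland’s 12.5% rate actually applies to. The figures loosely mirror the Google numbers cited, but the single licence-fee line is a deliberate simplification, not an account of any real company’s books.

```python
# Purely illustrative "Double Irish" arithmetic -- hypothetical figures that
# loosely mirror the Google numbers mentioned in the text.

gross_profit = 5_500_000_000   # profit booked in the Irish entity (EUR)
licence_fee = 5_455_000_000    # fee paid to the tax-haven entity that owns the IP

taxable_profit = gross_profit - licence_fee   # what the 12.5% rate applies to
tax_without_fee = gross_profit * 0.125        # what would be due without the structure
tax_with_fee = taxable_profit * 0.125

print(f"Profit left taxable in Ireland: EUR {taxable_profit:,.0f}")   # 45,000,000
print(f"Tax at 12.5% without the fee:   EUR {tax_without_fee:,.0f}")  # 687,500,000
print(f"Tax at 12.5% with the fee:      EUR {tax_with_fee:,.0f}")     # 5,625,000
```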

It appears, then, that it’s not Ireland’s low corporate tax rate which makes it an attractive location for multinationals but rather the specific tax legislation allowing this construction. The NY Times has produced an excellent illustration of how this works, with a detailed explanation.

Based on this it would appear that all the campaigning for Ireland to hold on to its low corporate tax rate, on the basis that a change might scare away these large multinationals, might not have been fully informed. Full disclosure requires that I admit I have used this argument too. However, I have always followed it by stating that Ireland should develop other means of being competitive than just being the cheapest tax country. Based on the information outlined above I would suggest that an increase in the corporate tax rate might not have such detrimental effects. But please remember that I am *not* a tax expert.

Following are some sources of supporting information:

http://www.irishtimes.com/newspaper/finance/2012/0514/1224316065838.html

http://www.businessworld.ie/livenews.htm?a=2941771

http://www.guardian.co.uk/business/ireland-business-blog-with-lisa-ocarroll/2011/mar/24/google-ireland-tax-reasons-bermuda

Last week @mrs_Bopp and I had the pleasure of attending the Aruba Networks Airheads conference in Nice, France. We were there at the invitation of Aruba Networks as a result of our work through Haiti Connect, to which Aruba had donated a substantial amount of hardware back in 2010. I had a good amount of communication with Aruba staff prior to the event and was really looking forward to it. The fact that it was in Nice might have contributed to that also. The flight from Dublin to Nice was quite uneventful but the arrival was slightly dampened by grey skies and rain! As we had just left a *sunny* Dublin this was not what we expected. Luckily this was compensated for by the welcome in the hotel which Aruba had arranged for the conference attendees. The venue was very slick and polished and the Aruba crew were all smiles and very helpful. We finally got to meet Jeanie, Sue & Bart, with whom I had been in contact prior to the event. As our flight hadn’t arrived until 4 pm there was no opportunity to attend any of the Monday workshops, but there was the cocktail party that evening to look forward to. We brought our bags up to the room, plugged in a multitude of devices to charge up and discovered that the *free* wifi was limited to 2 devices per room. The signal wasn’t very strong either. Clearly they weren’t using Aruba equipment 😉

The cocktail party was very entertaining, with copious amounts of food & drinks. It was great to meet so many different “Airheads” from all over the globe. At one point we were in a discussion with people from Ireland, the UK, South Africa, Saudi Arabia, Sudan, the Netherlands and Austria. The discussion ranged from wifi to politics, the price of petrol, taxes and more. All in all a very invigorating and entertaining evening. Bed came at 1 am and was very much needed after a 4:30 am start.

After breakfast and some excellent coffee, Tuesday started off with a splash: a James Bond-style opening show, followed by an opening word from Duncan Fisken (VP EMEA) and then Keerti Melkote (Aruba Networks founder & CTO), who spoke about Aruba Networks’ Technology Vision. Keerti’s talk was very interesting as it gave a clear insight into Aruba’s approach to the development in user connectivity requirements, device ownership and usage patterns, and how to make all this manageable. (I’ll add a link to the presentation when it is available online.) Next there was a short break followed by a client panel on BYOD issues, after which it was time for me to take the stage to talk about the use of WiFi in disaster response and about the work of Haiti Connect. I wasn’t sure how the presentation would be received but I can now safely say that it went down very well. The slick set-up, with two monitors displaying the slides for the speaker as well as a very visible timer, made it very easy to speak in a coherent manner. I easily filled the 45 minutes allocated.

<lunch>

<eat>

</end lunch>

After lunch there were a number of break-out sessions giving some hands-on experience of different WiFi-related technologies and applications. I first attended the one on “Designing Outdoor Mesh”, which was a delight as it clearly dealt with issues such as antenna modulation, signal propagation, interference etc., areas that are not covered often enough in WiFi network design. It also provided a good insight into the various mesh network topologies and applications as well as the use of Aruba’s Outdoor RF planner. Next were two more sessions, one on “Advanced wireless security” (interesting, but I was starting to flag a bit and needed coffee) and one on “Clearpass access management”. The Clearpass session was a real eye-opener. Aruba is clearly on to a winner with its solution to the BYOD issue. While it allows for excellent user & network management, the really impressive feature for me is the easy device “onboarding”, which allows a user to easily connect to the network and authenticate, after which pre-set network policies are applied according to user, device and application. This means less work for the network support department, and users can do this in “remote” locations where there is no direct access to network/sysadmin people. I can see this working very well in disaster response scenarios where a network is rolled out quickly and where a very heterogeneous user environment exists. Policies can be pre-set or easily added or changed by network admin staff without the need to come close to any of the users or client devices.

After a long day of seriously getting one’s geek on, it was time for some top-class relaxation and Aruba had really pulled out all the stops on this one! We were bussed to “Chateau de Cremat”, which sits in a stunning location on a hilltop north of Nice with magnificent views across the mountains and the Mediterranean in the distance. After some Casino Royale-themed entertainment we were treated to yet another excellent meal and some very drinkable wine from the chateau’s own cellars. Around 10 pm the buses departed again and, while some hardcore people wanted to go clubbing in downtown Nice, we decided to go back to the hotel and have a few more beverages on the rooftop terrace.

Wednesday unfortunately heralded the last day of the Airheads conference, but luckily it went out with a bang in the form of two excellent presentations by Dominic Orr (CEO, Aruba Networks) and Mike Wiley (Manager Global Networks @ Google). Dominic’s presentation was titled “License To Win” and ran us through Aruba’s technical & strategic development right from the beginning through to the next few years. It gave an excellent insight not only into Aruba Networks but also into the market that it operates in. Mike Wiley’s presentation was titled “Google’s Global WLAN Deployment” but it dealt with more than that. It illustrated clearly how people require ubiquitous connectivity, how they benefit from this and how best to deliver it.

What I have come away with from this conference is the impression that Aruba is very much on top of their game with both their hardware & software products. Both are excellent product ranges which complement & support each other. There is the obvious debate about controller vs controller-less architectures, but with the Aruba Instant APs they are now moving in both areas. While I’m very much a hardware kinda guy, I have learned through experience that a network deployment & management tool which is intelligent and adaptive is worth its weight in gold. Aruba’s MOVE & Clearpass hold great promise in that area and I can’t wait to test them in a live environment!


Wimax 4 years on….

Posted: November 9, 2009 in Uncategorized

The recent launch by Imagine Telecom of their WiMax service in Ireland, and all the subsequent discussion about the fact that the service wasn’t actually available yet, the lack of pricing & package information, as well as the misguided marketing campaign, prompted me to write a blogpost about my experiences with WiMax.


I have been working with wireless data technologies since 1999 and by 2005 I was what one might have called an “expert” in the field of wifi. In the 1-2 years up to that time I had also started looking at a new, emerging technology called WiMax (Worldwide Interoperability for Microwave Access), which was going through a development & ratification process with the IEEE’s 802.16 working group.
In short, WiMax is a wireless data protocol that operates on basically any frequency below 66 GHz. It is used to create a wireless data network over long(er) ranges than, for instance, WiFi and at a higher data rate.

One of the common misconceptions is that WiMax can do both: provide a very high data rate at long distances. Claims of 70 Mbit/s over 50 km are still common in the press. Nothing is further from the truth. WiMax can do one or the other; at short distances (up to 2-3 miles) speeds will more likely be in the range of 5-7 Mbit/s. At longer ranges the data throughput will decrease the further one gets away from the base-station/mast/cell-tower. Data rates at 50 km would be in the range of kilobits and as good as useless. Another issue to consider is that we’re talking about 2-way radio communication here, which means that the client device (your WiMax CPE) will need the “power” to transmit a signal back to the nearest mast/cell-tower. An additional point to consider is that the available spectrum on a base-station will have to be shared with all other users; this means that the available spectrum per user will decrease significantly in areas with a high population density. This will most likely result in a lower bandwidth per user.
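
As a rough illustration of that shared-spectrum point, the sketch below divides an assumed sector capacity evenly among active users; the numbers are hypothetical and only meant to show the trend, not to model any real WiMax deployment.

```python
# Back-of-the-envelope sketch of spectrum sharing on one WiMax base-station
# sector. All numbers are hypothetical and chosen only to illustrate the trend.

sector_capacity_mbps = 35.0    # assumed usable capacity of a single sector

for active_users in (5, 20, 50, 100):
    per_user_mbps = sector_capacity_mbps / active_users   # naive equal split
    print(f"{active_users:>3} active users -> ~{per_user_mbps:.1f} Mbit/s each")
```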

All this aside, WiMax still offers some great advantages over its predecessors (such as wifi). The available bandwidth & range are much higher than with preceding protocols and, in addition, it is an NLOS (Non-Line Of Sight) service. This means that it is not necessary for the base-station and the subscriber unit to have a direct line of sight (i.e. be able to “see” each other). Objects such as buildings, hills & trees will no longer be the obstacle that they were. There will of course be a certain level of signal degradation, but the coverage area of a WiMax base-station, or more precisely the number of possible subscribers within that area, will be much higher. I won’t go into detail on the two WiMax standards (802.16d & 802.16e) for fixed or mobile WiMax as this would have me digress from the point of my article even further.


Anyway, by the middle of 2005 I had read up a fair bit on WiMax and saw that there was potential, but also that there was a lot of hype surrounding this up-and-coming technology. I frequently discussed it with other people in my profession and entered into discussions in newsgroups (I wasn’t blogging yet and there was no Twitter then). I also discussed it with manufacturers of wifi hardware who were testing or exploring WiMax. Still, it came as somewhat of a surprise when, in August 2005, I was approached by a group of US-based investors and asked if I was interested in heading a venture to establish a pan-European WiMax provider. We did a lot of talking back and forth and I eventually agreed to do the groundwork for this venture, with a possible CEO position down the line. I established a company office in Ireland and went about exploring the market, immersing myself in the technology, developing potential partnerships, writing the business plan, drafting financial forecasts, operational plans etc. What became clear to me was that there were two viable business models:

  1. A WiMax provider with our own infrastructure, where we would concentrate on providing backhaul services for other operators alongside providing services (SaaS) across our network.
  2. A type of “virtual operator”, buying capacity on existing WiMax networks owned by other operators and providing services over those networks.

As you’ll notice, neither of these would involve providing “just” broadband connectivity. That choice was deliberate, simply because research had shown that this wasn’t where the real revenue was made. The market was simply so competitive and margins so low that it would be extremely difficult (or require very large investment) to successfully enter as a new provider. Another issue that made me reluctant to go down the route of building our own network was the cost of a licence to use the needed spectrum.

[Image: WiMax licence cost]

But the real difference was in how we planned to market it. No big “traditional media” campaign. Instead I had planned several key actions:

  • Do not launch a service/product until it is actually available and a customer can sign up and be connected within 48 hours.
  • Create an online “buzz” prior to a launch by creating a clear online presence and actively engaging with people via the social media channels.
  • Engage with a number of “tech pioneers” and allow them to test the service and provide honest feedback. Listen to this feedback and adjust service if/when needed.

There were several other points and this is only a fraction of the overall plan, but it is significant in regards to the rest of my blogpost. When I had completed all the groundwork for this venture it transpired that there just wasn’t the investment power required to carry it out successfully (I indicated a 100mln+ requirement at the time). So the project, and the SEC-listed company, were put on the back-burner. Incidentally, if there are any investors out there looking for a “WiMax operator in a box” (ready to go, SEC listed & prepped to go to market) feel free to contact me.

Anyway, I moved on to my next venture, Airappz, providing a location-based advertising service based around wifi hotspots (both in Ireland & abroad). However, I kept my finger on the pulse of WiMax, so I wasn’t surprised when in May this year I was introduced to someone working with Imagine Telecom, who were planning to launch a WiMax service in Ireland. Basically he wanted to discuss WiMax with me and, more specifically, how to market the service. We had a few chats and I suggested they consider some of the points made above. Shortly afterwards they sent out a press release announcing a partnership with Intel whereby Intel was investing 100mln (remember the figure I mentioned earlier?) in Imagine Telecom in order to fund a nationwide WiMax rollout. A detail to note here is that Imagine Telecom bought Irish Broadband from NTR in 2008 for +/-47mln. IBB was using Alvarion BreezeMAX technology, which was a pre-WiMax service, so they were building on this existing network, upgrading & expanding it. While I was in the planning stages for the US-owned company we had also had very preliminary discussions with IBB at the end of 2007, exploring the option of purchasing the company, so it all spins back into itself somehow. We didn’t go ahead because we didn’t agree with the valuation put on it by NTR and in the end it turned out we were right.

After Imagine’s press release nothing much seemed to happen until early October, when they had a big product launch. There was a big dog & pony show in Dublin followed by a big media campaign. What? Yes, a big m-e-d-i-a campaign. Big newspaper ads and huge billboards telling you to Google the term “wimax” to find out more. Which was pretty damn dumb as, until only a few days ago, Imagine didn’t even rank in the top 10 search results for “wimax”. During the launch all kinds of predictably hyped claims were made. Luckily we had someone tweeting these statements so they are recorded for future reference. You can check them here & here. The biggest lie was their claim to have “obtained enough spectrum to deliver future Wimax speeds of 40, 60, 100mbs“, which is very significant as the IEEE’s 802.16 specification indicates a maximum speed of 70Mbs. So, if you were impressed by the big launch, could you sign up for the service? No you couldn’t, as no price & package information was available until this weekend, more than 3 weeks after the big launch. This delay, which I assume was intentional, led to a lot of speculation and, more importantly, negative comments on Twitter, Boards.ie and other online networks. Didn’t anyone in Imagine or their PR company realise that if you create a big expectation followed by silence, people will start to talk & speculate?! When you create customer expectation it is essential that you can live up to that expectation, right there and then.

Now don’t get me wrong; I think that the arrival of a WiMax operator in Ireland is a significant step forward towards quality & affordable broadband. The packages that Imagine are offering on their website look very good and extremely well priced. Coverage seems to be mostly limited to the larger urban centres, which is to be expected, and you can see the existing IBB infrastructure in the green markers on their coverage map. I for one would certainly sign up if I lived or worked in their coverage area. Another great plus is that for an extra 5 euro p/m you get a WiMax dongle for broadband access on the go. I see that eating into the current 3G market share. I do wonder, though, whether they use 802.16d or 802.16e for their mobile access, as this will make a huge difference in being able to use it “on the move”, i.e. in a moving vehicle, crossing over between base-stations.

SEO & a distorted reality..

Posted: November 2, 2009 in Uncategorized


I’ve noticed a lot of discussions about SEO (Search Engine Optimisation) lately. It is the skill of working the content of a website and its inbound & outbound links in such a way that the site receives a top ranking in the relevant search results. There are some really smart SEO experts out there who will get your site to the top listing for certain keywords.

However, I have certain questions about all this. If the top 5 results of my search are there because of smart SEO and *not* because they: 1) have a really good product/service, 2) are excellent in the way they do business, or 3) are mentioned online by lots of people, then how relevant are these results? A business with a crap product & terrible customer service that spends lots of money on good & relevant SEO will still end up at the top of the search results. So what do the top search results mean then?

Personally I think the 2-3 lines of text below the title of each search result say a lot more. I normally scan these bits of text in my search results for relevancy to my query. This means that more often than not it will be result 5 or 7 or even 12 that I click on rather than 1, 2 or 3. As for Google AdWords: I can’t remember the last time I looked at those, or if I ever clicked on an ad.

The relevancy becomes even greater if & when you move away from Google, as a lot of SEO specialists will use the “price” of a Google keyword as an indicator of its relevancy and will adjust their SEO work accordingly. Other search methods and engines might give rather different results. There are certain search engines that are specific to a particular line of work or research. Getting a good listing on these will depend solely on real reputation and not on “worked” relevance. My personal favorite search engine is Copernic. Copernic has pulled up relevant results for me that no other search engine (incl. Google) has ever found. The best one was my grandfather’s name in a scanned newspaper page from 1903. No other search engine has ever been able to find this page.

So how important is SEO? That’s up to you to decide. In my opinion it is preferable to ensure a good reputation (as in being a reliable business with a good product or service) and to get lots of positive online feedback clearly mentioning your product or your business name. Twitter, Facebook & blogs are ideally suited for generating this type of feedback. I would be more inclined to buy a product that got lots of positive mentions on social networking sites than one that (solely) got a consistent no. 1 ranking in Google. But that’s just me….