Category: metrics

Is Google Analytics shit? WordPress metrics for SEO.


Background

So, I’ve been running this site for around a month now without hitches, with the homepage sorted and the page hierarchy structured as it should be.

I’ve had analytics on it from the start.

My current set-up is:

Apache server 2.*
WordPress 3.5.1
PHP 5.3.3-7

I’m not new to this game: I’ve been running a blog, this blog in fact, since 2003.

This has been the first full month when I’ve been able to sit back and analyse things, happy that the platform, the site, the applications and the product as a whole were working as they should (sitemaps and all).

Method

Currently I analyse my stats using two methods:

  1. Google Analytics: using the WordPress theme editor, I insert the Analytics tracking code just inside the opening head tag, so it’s the first thing to load after the definitions.
  2. Log files: raw log files downloaded directly from the server, run through the Weblog Expert Lite logfile analyzer in this case (although I know there is better software out there that can do a better job). A sketch of the kind of counting this involves follows below.
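For anyone curious what the log-file route involves, here is a minimal sketch of the per-day counting a log analyser performs. It assumes Apache’s combined log format; the access.log path and the crude static-asset filter are placeholders to adapt to your own set-up.

    # Minimal sketch of per-day page counting from a raw Apache log.
    # Assumes the combined log format; "access.log" is a placeholder path.
    import re
    from collections import Counter

    LOG_PATH = "access.log"  # substitute your downloaded raw log
    LINE_RE = re.compile(
        r'^(\S+) \S+ \S+ \[(\d{2})/(\w{3})/(\d{4}):[^\]]+\] "(\S+) (\S+)[^"]*" (\d{3})')
    STATIC_RE = re.compile(r'\.(png|jpe?g|gif|css|js|ico)(\?|$)')
    MONTHS = {m: i for i, m in enumerate(
        "Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec".split(), 1)}

    hits_per_day = Counter()
    with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
        for line in log:
            m = LINE_RE.match(line)
            if not m:
                continue
            ip, day, month, year, method, path, status = m.groups()
            # Count only successful page requests, not images/CSS/JS
            if status == "200" and not STATIC_RE.search(path):
                hits_per_day[(int(year), MONTHS[month], int(day))] += 1

    for (y, mo, d), hits in sorted(hits_per_day.items()):
        print(f"{y}-{mo:02d}-{d:02d}  {hits}")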

Results

Using the month of April, we are going to look at the disparity and breadth of information, the accuracy of that information, and a trust index for the two products.

Looking at the key indicators for site metrics, for a blog I take the key data as follows:

Visitors

Taking a look at the visitors graph.

You can see the Google graph scraping along, showing no improvement over the course of the month.

On the right, the Weblog Expert Lite graph shows a pronounced increase toward the end of the month (coinciding with my submission of sitemaps to both Bing and Google).

Google shows roughly 12-14 visits per day on a 500+ page site; Weblog Expert shows 200-400.
During this month I submitted a full set of comprehensive sitemaps, using the prescribed methods, to both Bing and Google.

Google Analytics visitors, April
Weblog Expert Lite visitors, April

Entry pages

One of your key indicator stats, and possibly the most important over time in deciding the direction you want your blog or content to take.
If your metrics software can provide you with an accurate picture of this (combined with the other key stats here), it holds the key to the success or failure of your site.

Google Analytics: as you can see, the only URL Google Analytics shows getting hits of any significance is the /blog/ root directory. This URL is effectively a 404 and, as a landing page, insignificant.

Weblog Expert Lite: shows the wp-cron activity of my WordPress installation. Not very helpful; stats like this clutter up the real information, and in the full version they can be removed by setting up filters. The next stat down is https://www.michaeltyler.co.uk/etrex-garmin-gps-to-google-maps/, traditionally, as an ex-geoblog, one of the site’s biggest attractions in terms of content. This post used to explain how to convert a set of waypoints directly from a handheld device, such as an eTrex, to your Google map. A sketch of filtering that wp-cron noise out of the raw log follows below.
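To give a flavour of the filtering the full version would do for you, here is a rough sketch that drops wp-cron.php hits and static assets from the raw log, then approximates entry pages as the first page request seen per IP per day. The access.log path is a placeholder again, and the session logic is deliberately naive.

    # Rough sketch: approximate entry pages from a combined-format log by
    # taking the first page request per visitor IP per day, after dropping
    # wp-cron.php hits and static assets. "access.log" is a placeholder.
    import re
    from collections import Counter

    LOG_PATH = "access.log"
    LINE_RE = re.compile(r'^(\S+) \S+ \S+ \[([^\]:]+)[^\]]*\] "(?:GET|POST) (\S+)')
    STATIC_RE = re.compile(r'\.(png|jpe?g|gif|css|js|ico|xml)(\?|$)')

    entry_pages = Counter()
    seen = set()  # (ip, date) pairs already credited with an entry page
    with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
        for line in log:
            m = LINE_RE.match(line)
            if not m:
                continue
            ip, date, path = m.groups()
            if "wp-cron.php" in path or STATIC_RE.search(path):
                continue  # WordPress housekeeping and assets, not real landings
            if (ip, date) not in seen:
                seen.add((ip, date))
                entry_pages[path] += 1

    for path, count in entry_pages.most_common(20):
        print(count, path)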

Google Analytics entry pages, April
Weblog Expert Lite entry pages, April

Search phrases

Another key indicator: what the traffic that comes to your site is looking for. Is having a grip on what people look to you to tell them about of any use? Maybe not, if your analytics software is giving you erroneous or useless information.

Check this out.

Google Analytics: Google, as you can see, is passing on what it’s managed to gather, which seems to be very little. Bearing in mind this covers only ‘visitor’ referrals, not calls on the server, it appears there’s actually very little of interest on my site, with most pages gathering a maximum of one search phrase (!?).

All spam in effect?

Weblog Expert Lite: Weblog Expert Lite shows a different picture altogether. Going from requests on the server, as opposed to ‘visitor’ stats, I can see a variety of search phrases pointing at searches like ‘manastir ostrog’, ‘montenegro’ and ‘ostrog’, and all manner of phrases targeting the giant walrus I blogged about at https://www.michaeltyler.co.uk/i-am-the-walrus/ (which actually exists in a zoo in Dortmund), or the shot of Kim Jong-un’s penis. All of this is helpful to me. I can use these images to promote my site by inserting some accreditation in the way of a watermark. Plus, I’m paying for the bandwidth; it’s placing a load on my server and costing me money.
I want to know about it.
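Here is a sketch of how those phrases can be pulled out of the raw log yourself, by reading the q= (Google/Bing) or p= (Yahoo) parameter from the referrer field. The log path is a placeholder, and this only works because search engines were still passing keywords in the referrer at the time.

    # Sketch: extract search phrases from the referrer field of a
    # combined-format log. "access.log" is a placeholder path.
    import re
    from collections import Counter
    from urllib.parse import urlparse, parse_qs

    LOG_PATH = "access.log"
    # Referrer is the second-to-last quoted field; user agent is the last.
    REFERER_RE = re.compile(r'"(https?://[^"]+)" "[^"]*"$')

    phrases = Counter()
    with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
        for line in log:
            m = REFERER_RE.search(line.strip())
            if not m:
                continue
            ref = urlparse(m.group(1))
            if not any(se in ref.netloc for se in ("google.", "bing.", "yahoo.")):
                continue
            params = parse_qs(ref.query)
            for key in ("q", "p"):  # q = Google/Bing, p = Yahoo
                for phrase in params.get(key, []):
                    phrases[phrase.strip().lower()] += 1

    for phrase, count in phrases.most_common(30):
        print(count, phrase)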

Weblog Expert Lite search phrases, April
Google Analytics search phrases, April

Top referring URLs

A list of where your requests are coming from. If there’s juice being sucked out of your site, this is where it’s going.

Weblog Expert Lite: Weblog Expert Lite traces the top 50. Two of these I’ve tracked down and asked to comply with the site’s copyright terms and conditions. The rest are easy to ban using the robots.txt or .htaccess file if they become anything more than an annoyance.
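For what it’s worth, this is the sort of .htaccess rule I mean: a sketch only, assuming Apache 2.x with mod_rewrite enabled, where ‘badscraper.example’ and ‘BadBot’ are made-up names to swap for whatever the logs turn up. Bear in mind robots.txt only deters well-behaved crawlers, whereas .htaccess refuses the request at the server.

    # Sketch of an .htaccess block for an unwanted referrer or bot.
    # Requires Apache 2.x with mod_rewrite; the names below are made up.
    RewriteEngine On
    RewriteCond %{HTTP_REFERER} badscraper\.example [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} BadBot [NC]
    RewriteRule .* - [F,L]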

Google analytics: No stats available.

Weblog Expert Lite top referring URLs, April
Not available in Google Analytics

Error reporting

Sometimes, in fact a lot of the time, you’re going to find that visitors reach parts of your site which are no longer there. The pages may have changed, moved location or disappeared permanently.

Whatever has happened, it’s up to you to track these visitors and guide them along their way in the best way you see fit.

Weblog Expert Lite: Weblog Expert Lite sees a number of 500 server errors, all occurring on the same date. It reports a standard amount of 404s (below the hundred mark), which is probably about the normal level. It also shows me the site was having a WordPress error.
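Here is a minimal sketch of pulling the same picture out of the raw log, tallying 4xx and 5xx responses per day so a one-day spike of 500s stands out; the same combined-format and placeholder-path assumptions as the earlier snippets apply.

    # Sketch: tally 4xx/5xx responses per day from a combined-format log.
    # "access.log" is a placeholder path.
    import re
    from collections import Counter

    LOG_PATH = "access.log"
    LINE_RE = re.compile(r'\[([^\]:]+)[^\]]*\] "[^"]*" (\d{3}) ')

    errors = Counter()
    with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
        for line in log:
            m = LINE_RE.search(line)
            if not m:
                continue
            date, status = m.groups()
            if status.startswith(("4", "5")):
                errors[(date, status)] += 1

    for (date, status), count in sorted(errors.items()):
        print(date, status, count)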

Google Analytics: Google tells me nothing. Yep, absolutely nothing about who’s reached my site in error, nothing about the dates or times they’ve visited and hit an error, or the errors they’ve experienced. Not the best, you can imagine. Not even professional, and certainly not acceptable for anyone running a professional site.

I’ve started using a service called Pingomatic, based in Sweden; they monitor your site status and response times. Currently one site is free.

Weblog Expert Lite error reporting, April
Not available in Google Analytics

Browsers

The browser is the user’s first point of entry onto the world wide web under most circumstances. Information on the browsers people use is going to be important for issues such as:

  • Compliance: how your site functions according to W3C standards.
  • Design: not much use if your site only runs well on IE 3.x (back in the 90s sometime) and you’re rendering using <font> tags! Nor if you only render on the most up-to-date versions of IE, Firefox or Chrome. You’ve got to be aware of the percentages involved in the various release brackets and cater for the maximum breadth of browser platforms at the minimum potential cost.
  • Visitor OS platforms: each modern OS platform has its own browser type. This is especially true of touchscreen technology, which is proving more and more popular for those accessing the medium.

Weblog Expert Lite: Weblog Expert Lite has some drilldown; from the stats it’s easy to identify a sudden pick-up in usage from the Safari mobile browser, along with activity from the Android and Galaxy mobile platforms.

Google Analytics: looking across to the Google equivalent, I can see stats which seem to tell exactly the opposite story. In terms of mobile users, Google Analytics tells me the Android platform is providing most of the traffic. Indeed this may be true of the 14-visitors-a-day sample it uses, which I also believe is an error.
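To cross-check the two tools, a crude tally of browser families from the user-agent field of the raw log is enough to see the mobile/desktop split for yourself. The matching below is simple substring lookup and the log path is a placeholder, so treat it as a sketch rather than anything definitive.

    # Sketch: crude browser-family tally from the user-agent field.
    # Substring matching is approximate; "access.log" is a placeholder.
    import re
    from collections import Counter

    LOG_PATH = "access.log"
    UA_RE = re.compile(r'"([^"]*)"$')  # last quoted field is the user agent

    # Order matters: Android and Chrome user agents also contain "Safari".
    FAMILIES = [
        ("Android", "Android browser"),
        ("iPhone", "Safari mobile (iPhone)"),
        ("iPad", "Safari mobile (iPad)"),
        ("Chrome", "Chrome"),
        ("Firefox", "Firefox"),
        ("MSIE", "Internet Explorer"),
        ("Safari", "Safari (desktop)"),
    ]

    browsers = Counter()
    with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
        for line in log:
            m = UA_RE.search(line.strip())
            if not m:
                continue
            ua = m.group(1)
            for needle, label in FAMILIES:
                if needle in ua:
                    browsers[label] += 1
                    break
            else:
                browsers["Other / bot"] += 1

    for label, count in browsers.most_common():
        print(count, label)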

Weblog Expert Lite browsers, April
Google Analytics browsers, April

Summary

I had some reservations about the way WordPress published the tracking code, but having checked pages on the site, I never came across one where the Google code wasn’t present.

Google is woefully under-reporting my visitor numbers.

It’s either not reporting key metrics or reporting them inaccurately.

It’s not able to provide information on who is leeching my site and causing possible infringements.

It’s not able to tell me about potential errors, or when they’ve occurred.

Recommendations

If you’re running a WordPress site and you want to boost your readership, SEO effectiveness, metrics analysis and intelligence, consider using log files.

As my little experiment has shown, the stats you get from logs uncover a number of key SEO components which Google has overlooked, doesn’t care about, can’t measure, or is looking to make money on in the future and so won’t include (that wouldn’t be like Google, would it?).

To take the leading hand in steering your blog, you’re going to need the whole, accurate picture. Maybe this means drilling down into the minutiae of your raw referrer logs, getting your meta-hands dirty and spending a little time; in my eyes that’s infinitely better than being handed figures by Google which are plain wrong.

For me and my WordPress blog, it’s log files for now, at least until I hit around the 10,000-a-day mark again. 🙂

PS: If anyone else has had any experience of implementing the Google code on the WordPress platform, what kind of metrics you’ve used to make your blog more popular, and whether you think the stats are better or worse than more traditional methods, I’d be interested to know.

Why I hate Google


I hate Google because they lie.

Google manipulates results to make more money

Let me explain.

Background

I’ve been an Internet publisher since 1998, back in the day when people were using AltaVista as their main search engine, Netscape Navigator was the main browser, Yahoo was a new invention and Usenet was the in-thing (social networking hadn’t been invented yet).

I was a publisher when Google bought the trust of the internet community with a repository of old musings called the Déjà-News archives.

Even the spammers thought Google was alright.

Florida Update (Feb 2005)

Spammers spammed, hoaxers hoaxed, phishers phished, the good generally mixed with the bad and the Ugly.
Google said, this is a bad thing.

“Why do we need all these spammy sites all mixed in with the good people trying to make honest money, Serg?”

In 2005, Google rolled out a massive update. They called it Florida; it took a chunk of sites and cut their traffic by half.

Of course, everyone in the industry panicked.

But there was method in this madness

Google were clever. Those sites which were spammers got busy changing everything:

  • their page names
  • their pages’ content
  • keyword densities and descriptions
  • their backlink text
  • their link counts, through publishing more spam

Just about anything anyone could suggest that would improve their rankings again.

They made a lot of changes.

By measuring those changes and comparing how willing webmasters were to change their sites over this period (about 4 months), Google could make a guess at each site’s actual worth. A legitimate site might be able to adjust perhaps 10% of its content for the sake of a better ranking, whereas a site with useless content would change anything.

Simple.

Those who adjusted little (there was really little to change, believe me) stayed in. Those who adjusted a lot went out.

Penalty Update (October 2007)

In October 2007, another update took place. Again, my site took a massive hit.

Google vaunted it as an update punishing paid links**.

My site has never accepted paid links. Nor ever paid for links.

I said nothing; my site doesn’t publish spam, nor does it undertake practices not sanctioned in the official Google Webmaster Guidelines. It’s taken me from then until now to put two and two together and figure out exactly what happened.

Manually altering results
In October 2007, Google began manually altering search results to generate more income, in line with the forthcoming changes to its AdWords policy released this May.

Prior to May 2008, Google wouldn’t allow bidding on trademarked terms.
For example: if you wanted to buy a ClimaCool Adidas jacket and typed that into your search engine, chances are you’d come up with the manufacturer’s site first, then a number of other sites also selling that particular product.

That’s where my site lived for a long time.

I sold budget flights both on a commissioned and non-commissioned basis and was the first site in the UK to do so.

My rankings were up until October 2007.

I directly link the May 2008 changes with the October 2007 update.

Here’s the stats.

Keyword analysis 2006-2008 (Download PDF)

I’ve sat back, because I’m fairly sure my site’s as good as it can be. It’s never broken the rules, and as such there’s nothing I can do.

Net effect of changes to Google AdWords policy

Reduce the diversity of the ’net: Mom & Pop sites that have traditionally employed SEO techniques to differentiate themselves from the crowd are going to find their business reduced. Why would Google send visitors to a site where they can find all the best goods when it can take $1.80 per AdWords click from someone who can afford to pay?
It won’t.

This will reduce the diversity of the ‘net, putting smaller, more efficient operators out of business and handing more money to corporations who take their place.

Channel more money to corporations: Google is currently straddling freedom of information (the spirit of the net) and corporatism. Freedom of information means exactly that: free.

Organic listings* represent free information.

By manipulating and removing sites with high-quality, focused information and allowing trademark-orientated AdWords adverts to appear alongside poorer results instead, Google is skewing the pitch toward corporatism:
those with advertising budgets and skilled staff to administer complex AdWords campaigns in a meaningful fashion.

Google likes to hide behind the veil of ‘freedom of information’ whilst reaping the profits of corporatism.

Stymie new development: I’ve developed a substantial number of pages, and Google refused to list them. (See thread).

Don’t assume that if you’re a developer and you develop a top new app or service it’ll get listed. If it interferes with Google’s business interests, Google will either pull it when it realises it’s making money, or not list it in the first place.

This stymies bedroom developers who depend on a traffic flow to promote their product or service.

In the past Google could be relied on to provide that relationship. That isn’t the case anymore.

Reduce search quality: if Google is prepared to sacrifice search quality in the name of AdWords, this signals to its engineers that they can have their own ideas too.

Google have moved away from the purity of information-weighted results, making this a breeding ground for opportunist engineers who stand to politicise search in favour of whoever pays them to be presented in a positive light.

Search results of the future could be ruled by a cabal of powerful search engine engineers wearing funny hats (much like the Google Webmaster forums are now), thrashing out each other’s egos on the innocent webmasters below, ‘Lawnmower Man’ style, whilst handing the best listings to those who pay them to speak up in their favour.

Google Summary

  • Google manually manipulates SERPs in the name of business interests.
  • Google pretends, through its Webmaster Guidelines, that you will get high rankings if you do the right things.
  • Google’s Déjà-News, freedom-of-information, spirit-of-the-’net days are dead.
    If you want an impartial search engine, build your own.

* An organic listing is a listing that appears on merit.

** Links are a means of valuing a site’s worth in internet terms. The more people link to your site, the more respect you get in the community, and the more your site is worth.

RFID – Being aware



I know evil doesn’t exist, except in the imagination of people (like me), but here’s a real excuse to believe everyone has good intentions.

It’s called RFID, or Radio Frequency ID.
First employed during the Second World War in Allied bombing campaigns, the technology has seen some growth and the industry is now worth $2 billion.
This is just the tip of the iceberg.

RFID opens up all sorts of opportunities for companies to monitor our movements in real time.
For example, you may not be aware of it, but if you are the owner of a bottle of Viagra, Pfizer may be monitoring you right now.
Think I’m joking?
I’m not.

They’re not the only ones: American Express has new card issues fitted with an RFID chip, Procter & Gamble has filed patents, and NCR is amongst the other big names looking to muscle in on the act.

What’s it all about?

RFID monitors use a semi-ductile, lightweight, low-cost tag.
When fitted to a product, these tags relay information back to a receiver.
You may have seen them on new items of clothing or taped to new CDs.

Any product fitted with RFID within range of a receiver would provide information: a geo-fix and a product beacon, leading to data on when those products were bought, how long you’ve owned them, how much they cost and where you bought them from.
If you kept them on your person: when you returned to refresh your product, and where else you went in between.

Some RFID tags carry more important information.
The AMEX RFID contains personal banking information on the card holder.

The worrying thing is that RFID information is gathered without regulation.
It can be sold on without regulation.
No Data Protection Act here.

FMCG Future
With tag costs set to drop below 1p in the near future, support from heavyweight FMCG companies, banking and technology institutions, multiple patents filed and an industry screaming out for large quantities of qualitative marketing information, the future looks set.

Expect these snazzy labels to be appearing near you soon, whether you believe it or not.

Copied and pasted from: BBC News Website.

Lloyds Bank Charges


I recently received a £60 fine for going overdrawn with my bank, who have refused to repay the money.

This should be an interesting case, as it’s receiving a lot of publicity at the moment.
I’ll be using the blog to document my progress.

Links:
Lloyds Bank
Moneysaving expert

It’s not much in my case, but it costs £30 to go to county court, and it doesn’t cost £60 to write me a letter telling me I’m overdrawn.

We’ll see how it goes.

Google Maps polyline creator


I spent about 20 searches today trying to re-locate the Google Maps polyline creator tool.

As well as letting you create a polyline in real time and on the fly, it is the only resource on the net able to give the poly co-ords interactively, in an hh.mm.ss.mm format or GeoRSS format.
> googlemaps polyline creator.

This facility allows you to grab web-compliant co-ordinates from anywhere worldwide.
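As a small illustration of the ‘hh.mm.ss’-style output mentioned above, here is a sketch of converting a decimal-degree co-ordinate to degrees, minutes and seconds; the Amsterdam co-ordinates in the example are purely illustrative and have nothing to do with the tool itself.

    # Sketch: convert a decimal-degree co-ordinate to degrees/minutes/seconds,
    # the sort of format the polyline creator can output.
    def to_dms(decimal_degrees: float) -> str:
        sign = "-" if decimal_degrees < 0 else ""
        value = abs(decimal_degrees)
        degrees = int(value)
        minutes_full = (value - degrees) * 60
        minutes = int(minutes_full)
        seconds = (minutes_full - minutes) * 60
        return f'{sign}{degrees}° {minutes}\' {seconds:.2f}"'

    # Illustrative example: a point in central Amsterdam
    print(to_dms(52.3702), to_dms(4.8952))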

Effect of Google Map on Site Metrics


Just out of interest, I’ve compared the key indicators from before and after the addition of the Google Map into the header of the New Zealand blog.

The effects are highlighted on the scan below.
Effect of Google Map on Site Metrics

The main indicators are time and length of session, along with depth of session, all of which are showing good increases.

The main entry point for the site is the Netherlands blog, from Amsterdam mostly, its popularity stemming from the drugs and sex references in the posts and pictures, such as the sex museum and the Hash Museum, etc.

Another popular entry point was the Nessie blog, complete with a picture of Nessie which used to come up first or second on Google Images.

Considering the header is only present on the main blog, and I have these blogs plus another three blog main pages on this site where it is not included, these stats have further to go.