Hard News: Funnel Up!
83 Responses
-
And yes it is remarkable that a small group of blogs can have an audience of 1.3 million (which is substantially larger than that of all NZ newspapers).
I'd replace the word remarkable with the word unbelievable.
-
Daryl,
The number is the number.
And if there is an error in the methodology, it will not be by an order of magnitude. Via Google Analytics, Scoop's audience alone is close to 500k, as is Geekzone's. The fact that the combined audience of the network is close to 1.3 million is not at all surprising in light of that fact.
As explained they are not all NZers. Geekzone has a big US audience as does Scoop.
However, the NNR-audited monthly NZ audience is still larger than all newspaper circulation figures except the Herald and the Sundays. That may be surprising, but it is also true.
al
-
Oh, and well done the Chiefs.
On their behalf, thank you. But it was bloody close and a little strange when the fog rolled in. We were sitting behind some very obnoxious Hurricanes fans, so the final result was needed.
-
There it is. That's what bothers me. If it's a way of increasing your advertising muscle, fine - who cares?
That came out all wrong. Nasty. I meant: if you do it to increase your advertising muscle, fine - but it's a bit like laws and sausages, i.e. not something that, as a reader, I feel I need to know. If the connection however is of a tangentially editorial nature, as Alastair suggests with the comparison with the US experience of misunderstanding, then I'd be more interested in the discussion.
-
Alastair,
Like I said, I respect what you're doing and applaud the idea of a locally owned online advertising outfit. And I assume you're working/debating in good faith. But from my own experience I'm aware that the number of people reading blogs in this country is microscopically small.
I work in a university, have friends in arts, media and technology industries, key cohorts for readership - and because New Zealand blogging is a preoccupation of mine I raise the subject a lot and quickly learn that very, very few people have heard of sites like Kiwiblog or Public Address. By contrast, I have never had a conversation about media in which people had never heard of the Dominion-Post or the New Zealand Herald - so it's hard to see how the former could be more popular than the latter.
I'm a scientist, and I'm inclined to accept statistics over anecdotal evidence. But when the statistics are this incredible - a quarter of the country reading blogs, more than the entire national population using trademe every month - you have to let common sense have a say.
One of the nice things about working in a university is that when you're confused about something you can wander down the corridor and ask an expert. A friend in Computer Science who has done work on the university web site advised:
there are many problems with the unique visitor stat. Here is one: most libraries, internet cafes, schools, universities and businesses have a security policy that deletes cookies and other internet files when users log off. This prevents the hard drives filling up. These cookies are how unique visitors are tracked. So if you are a TradeMe junkie visiting the site twice a day every day at your local internet cafe or workplace, but the cookie is getting deleted each time, then at the end of the month you will show up as 60 “unique” visitors. This is an extreme example, but from our in-house tests it is not unusual for unique visitors to be out by a factor of 20.
My understanding is that unique visitors are fairly accurate over short periods of time - 1 to 2 days - but then get increasingly inaccurate over time as problems like cookie deletion kick in, until by the end of the month you are orders of magnitude out.
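The cookie-deletion effect is easy to make concrete. A toy sketch, using the hypothetical internet-cafe regular from the comment above (the numbers are illustrative, not real measurements):

```python
# Toy model of cookie-based "unique visitor" counting. One regular
# visitor whose browser wipes cookies after every session gets a fresh
# cookie per visit, so the tracker counts them as a new person each time.

def monthly_uniques(visits_per_day: int, days: int, cookies_persist: bool) -> int:
    """Return the unique-visitor count a cookie-based tracker would report."""
    if cookies_persist:
        return 1  # same cookie every visit: correctly counted as one person
    return visits_per_day * days  # fresh cookie every visit: one "unique" per visit

# The internet-cafe regular: 2 visits a day for 30 days.
print(monthly_uniques(2, 30, cookies_persist=True))   # 1 actual person
print(monthly_uniques(2, 30, cookies_persist=False))  # reported as 60 "uniques"
```

The longer the reporting window, the more deletion events accumulate, which is why the monthly figures drift so much further from reality than the daily ones.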
-
But when the statistics are this incredible - a quarter of the country reading blogs, more than the entire national population using trademe every month - you have to let common sense have a say.
I agree - the idea that blogs are more popular than traditional media is patently ludicrous. But that doesn't mean the numbers are wrong as such.
There is a difference between using a site and ending up on it. I can tell you with certainty - precisely because of how few readers I have, which makes my statistics very easy to interpret indeed - that a good chunk of my readership never meant to end up on my blog in the first place and got out as soon as the fact became obvious, which is to say, most probably within the first second.
Google over-estimates the relevance of blogs in a truly staggering manner (I am the first result for "memory and technology", which is insane if you look at the other results on the same page), and people find me looking for bats, for beams, for legumes, for sexual content (I wish I could help them there) - all sorts of info that I simply do not deliver. So very early on in the piece I started to ignore the weekly new readers and measure my readership simply by returning readers. Which again is not terribly accurate, in that sometimes a returning reader will appear new due to a browser switch or whatever, but so long as you look at a small enough unit of time - I go by the week, because I happen to publish weekly - it will give you a fair idea of the trends at least. You can also cross-reference it with the feed subscriptions to check that it makes sense.
Now, I read PA from three different computers, but I am not three people. Also, buying and reading the NZ Herald every day is not the same thing as occasionally clicking on PA or Scoop and spending one minute there every four weeks - but according to Alastair's metric those two manners of readership would be qualitatively the same. Then there are all the people mentioned above who find PA etc. by mistake and leave it within a matter of seconds. Nobody buys a newspaper and then drops it on the floor. Not even the Dominion Post, for which it would be the only appropriate usage.
By the same token, I imagine that TradeMe, while actually relevant and large, attracts a lot of people from overseas who don't know that it's a local NZ site and are simply looking for wares. There must be a ton of those every second, so the 4.3 million statistic doesn't seem out of whack. Their actual, meaningful readership is a different matter entirely.
-
There is a difference between using a site and ending up on it
I think there's a lot to be said for that, Giovanni - some time last year I posted a picture of Ayn Rand on my blog, somehow it rose to the top of the Google image results for Ayn Rand, and I had 100+ extra visitors per day looking for pictures of Ayn Rand. True story.
I guess it's in the interests of people selling ads online to use metrics like 'unique browsers/month', but if I were paying for ads on these sites I'd want statistics like 'number of visitors/day who stayed longer than 10 seconds'.
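That engaged-visit metric is trivial to compute if you have session durations. A sketch over hypothetical session records (the dates, visitor IDs and durations are made up):

```python
# "Visitors per day who stayed longer than 10 seconds", as the comment
# proposes. Sessions are (day, visitor, seconds-on-site) tuples.
sessions = [
    ("2009-06-20", "a", 2), ("2009-06-20", "b", 45),
    ("2009-06-20", "c", 1), ("2009-06-20", "d", 130),
]

def engaged_visits(sessions, min_seconds=10):
    """Count sessions per day that lasted longer than min_seconds."""
    counts = {}
    for day, _visitor, secs in sessions:
        if secs > min_seconds:
            counts[day] = counts.get(day, 0) + 1
    return counts

print(engaged_visits(sessions))  # only two of the four visits count
```

The catch, of course, is that session duration itself is inferred from cookies and timestamps, so this metric inherits some of the same measurement problems - but it at least discards the one-second bounces.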
-
(Makes mental note to include pictures of Ayn Rand in every one of his posts hereinafter.)
-
Makes mental note to include pictures of Ayn Rand in every one of his posts hereinafter
Be prepared for your comments section to get weird.
-
To be fair, are things any different in the print world? Readership and circulation figures are famously bogus. In the end, I think it's best to treat these figures as relative measures, and use them to compare sites. Trying to reliably infer what really happened from the indistinct signs and spoor in our log files is a foolish thing. But I'm sure we can distinguish my piddly blog from PA System from log file inspection, even if we don't know what readers were doing.
-
But I'm sure we can distinguish my piddly blog from PA System from log file inspection, even if we don't know what readers were doing.
One thing is comparing like for like, blog for blog (although I have issues for instance with what Tumeke chooses to regard as reliable metrics), quite another is to claim that a handful of blogs have a significantly greater readership than all the newspapers in NZ.
-
But it was bloody close and a little strange when the fog rolled in. We were sitting behind some very obnoxious Hurricanes fans, so the final result was needed.
Crusaders' shamans needed the practice. But srsly, how cold WAS it? Because it looked fairly arctic. Watching players run across the field trailing clouds of steam behind them like Thomas the Tank Engine was quite bizarre.
-
Ah, I love the smell of net stats in the morning.
Russell's paradox
It might be assumed that, for any formal criterion, a set exists whose members are those objects (and only those objects) that satisfy the criterion; but this assumption is disproved by a set containing exactly the sets that are not members of themselves. If such a set qualifies as a member of itself, it would contradict its own definition as a set containing sets that are not members of themselves. On the other hand, if such a set is not a member of itself, it would qualify as a member of itself by the same definition. This contradiction is Russell's paradox.
On the other hand, it may have something to do with dynamic IP addresses, eh.
-
But srsly, how cold WAS it? Because it looked fairly arctic
'round 2C. But the sun gods are shining on us today. As regards the obnoxious Hurricanes fans, I came close to sticking their bloody electronic foghorns up some place painful. Maybe they were useful in finding their way home!
-
Watching players run across the field trailing clouds of steam behind them like Thomas the Tank Engine was quite bizarre.
Not surprising the fog came as there was clearly no wind. The steam stuck around in the air for a minute after a scrum.
I felt like someone needed to track the steam throughout the game and build up an animation showing it across time and place and then reverse engineer that to commentary:
"Ooh, a mighty buildup of steam in the middle of the field, that must be a scrum. Now over to the sideline, that's a lineout. And then thick streaks of steam through the middle - Nonu on the break! Then a ruck, and now the steam is back down the other way, it's been kicked downfield by the Chiefs!"
You could relate it back to the game and see who was best at the exercise. Or maybe get a life, either way.
-
Having played in the web-serving space for 13 or so years, I'll state with a degree of certainty that log stats are only useful for
* determining whether your site is adequately provisioned for the number of http (and other) calls you get,
* detecting what pages people are not getting (e.g. 404) as that may indicate a mistake on someone else's page, or that you've broken something
* detecting trends as certain files (articles, images etc) become more popular
* indicators of where people came from to your site and where they went afterwards
* some other useful indicators of how people use your site
What they are not useful for is absolute numbers of people who viewed your site. And they're really crap for comparing sites numerically. I agree with Stephen that you can get some relative information, which may or may not be useful, but there are no absolute metrics that I would trust except how hard my server has to work to fulfil requests.
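Two of the uses in that list - spotting 404s and trend-watching - are a few lines of log parsing. A minimal sketch against Apache-style log lines (the sample lines and paths are made up):

```python
import re
from collections import Counter

# Pull the request path and status code out of a common-format log line.
LOG_LINE = re.compile(r'"(?:GET|POST) (\S+) \S+" (\d{3})')

log = [
    '1.2.3.4 - - [20/Jun/2009] "GET /index.html HTTP/1.1" 200 5120',
    '1.2.3.5 - - [20/Jun/2009] "GET /old-page.html HTTP/1.1" 404 310',
    '1.2.3.6 - - [20/Jun/2009] "GET /podcast.mp3 HTTP/1.1" 200 9000000',
    '1.2.3.7 - - [20/Jun/2009] "GET /podcast.mp3 HTTP/1.1" 200 9000000',
]

not_found = Counter()  # broken links to chase up
hits = Counter()       # which files are trending
for line in log:
    m = LOG_LINE.search(line)
    if not m:
        continue
    path, status = m.group(1), m.group(2)
    hits[path] += 1
    if status == "404":
        not_found[path] += 1

print(not_found.most_common())
print(hits.most_common(1))
```

Note that everything here counts requests, not people - which is exactly the distinction the comment is drawing.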
-
Having played in the web-serving space for 13 or so years, I'll state with a degree of certainty that log stats are only useful for ...
Are services, such as Google Analytics and Nielsen, that use tracking code on third-party websites any different?
Nielsen and Analytics are generally very close for all the sites I know about, so whatever they're measuring, they're getting the same results.
Also, our log stats say ~85% of our traffic by volume is now MP3 files. The PA Radio podcast is a rather vivid long-tail story.
-
I get completely different results from Google and Statcounter on my Blog. I have no idea why.
-
Are services, such as Google Analytics and Nielsen, that use tracking code on third-party websites any different?
AFAIK
GA can tell you about your site's activity. It takes its own logfiles based on its monitoring of your site. Each 'page' is tagged and Google keeps count. I'm not aware that it does comparative analysis with other sites, as that would require access to their 'private' data.
ACNielsen/NetRatings is sort of a black box. They use a lot of proprietary methods and no-one is really sure how it works. See their online page.
Nielsen and Analytics are generally very close for all the sites I know about, so whatever they're measuring, they're getting the same results.
I've seen that, and I've seen the total opposite - I honestly don't know. In the end, comparative analysis with other sites is not hugely useful. You can spend an awful lot of time getting to be number one in your niche and find that you've spent more getting there than it was worth.
Are your customers happy? Are there more every month? (No? What did you change?) These questions are more important. Reading logfiles or any analytics is a black art because every site is different.
Also, our log stats say ~85% of our traffic by volume is now MP3 files. The PA Radio podcast is a rather vivid long-tail story.
That says to me "buy more storage and get a better bandwidth plan" ;-)
-
That says to me "buy more storage and get a better bandwidth plan" ;-)
It says to me: "I'm glad we don't host in New Zealand"
-
Are services, such as Google Analytics and Nielsen, that use tracking code on third-party websites any different?
Fundamentally, they can't be. All their tracking code does is set cookies and request resources from the third party site that result in log files (or database entries) at the third party site. They have no way of capturing any more information than what you or I can capture with a log file and a willingness to slice and dice it. The value that Google Analytics or Nielsen provide is:
- they can benchmark against other sites
- Google Analytics can tie in with Adsense
- they do the statistical slicing and dicing for you
- they have pretty visualisations that encourage you to dig
As to what can be captured:
- IP number (network and geographical location can be inferred from this with some inaccuracy)
- cookies (allows inference of returning visit, visit length, bounce rate -- it's based on inspecting cookies, and so depends on the browser accepting cookies and the user not deleting them)
- referring page (with more or less precision -- based on information provided by browser, and some browsers lie; this is inspected to extract google search terms too)
- browser type (some browsers lie)
- operating system (some browsers lie; it's possible to sniff OS type with some accuracy by low-level inspection of packets, but I doubt they do this)
- screen dimensions (relies on Javascript in the browser)
- time to serve a resource from first to last byte (connection speed can be inferred from this)
Anyone who operates their own web site on their own server can do this. There is no special magic.
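For instance, the search-term extraction mentioned above is nothing more than referrer parsing. A sketch, assuming a Google-style `q=` query parameter (the example URL is invented):

```python
from urllib.parse import urlparse, parse_qs

def search_terms(referrer: str):
    """Return the query the visitor typed, if the referrer is a Google search."""
    parts = urlparse(referrer)
    if "google" not in parts.netloc:
        return None  # not a Google referrer; nothing to extract
    return parse_qs(parts.query).get("q", [None])[0]

print(search_terms("http://www.google.co.nz/search?q=memory+and+technology"))
```

Like everything else in the list, it depends entirely on the browser volunteering the Referer header honestly.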
One of the naughty things these services do is imply a degree of precision that they don't really have. For example, if I tell you that you had 32,516 visits last week that sounds as though I have an efficient and accurate system, but actually, that's a best guess based on shagging with cookie data and the actual number could be higher or lower (and depends on how "visit" is defined too). It would be more honest to say "around 30,000, probably somewhat more."
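That honest rounding is just significant-figure rounding plus a hedge word. A sketch (the function name and defaults are my own):

```python
# Report a visit count at a precision that matches what cookie-based
# measurement can actually support, instead of implying five-digit accuracy.
def honest_count(n: int, significant: int = 1) -> str:
    """Round n to the given number of significant figures and hedge it."""
    if n == 0:
        return "around 0"
    digits = len(str(abs(n)))
    factor = 10 ** (digits - significant)
    return f"around {round(n / factor) * factor:,}"

print(honest_count(32516))      # "around 30,000"
print(honest_count(32516, 2))   # "around 33,000"
```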
-
It says to me: "I'm glad we don't host in New Zealand"
I'd call that a better bandwidth plan ;-)
-
I get completely different results from Google and Statcounter on my Blog. I have no idea why
Google probably has more depth to the numbers.
I look after quite a large number of sites and we use a selection of tracking systems. There are the usual logfiles and then I have some others that show splits from spiders and people since many of my sites have blog elements that are pinging external services.
I sometimes use caching systems if there is a traffic spike, for example, as sometimes happens if a link is bookmarked on one of the high volume services.
All traffic is not the same, however, and it's useful to know what the RSS feeds are doing and if any big media files are dragging along, as well as "what Mark Harris said" about trends / 404s and the usual activity trails to watch out for.
I think overall traffic is of interest / repeat traffic is good. Impressions can be useful but maybe a better indicator is the number of comments and interactions.
Some of my sites have threaded comments which is a bit more like the forum approach. I'd say that would be one element I'd look for with Supermodel if Karl / Matt is reading.
I've played around with Woopra and some other alternative analytics packages but since most of my sites are open sourced there is a fairly constant stream of updates and overall if a site is growing that is a good thing for advertisers and everyone else.
-
Is anyone else a bit weirded out by the new "A Sign of Hope" banner ads that have popped up recently on PA System?
It appears to be promoting the website of some fruity new age group who believe some star is signalling the return of "the World Teacher". And as far as I can tell, it is not a pisstake.
I'm actually a bit offended by it - the assumption that as a member of the PA System community, I would somehow be interested in this new age bullshit.
-
Nope, errr, because banner blindness for the win! My stand on banners is that as long as they don't actually grab my attention or perform nasty tricks to get under my mouse (sliders, we hatessss them), they can stay.
Also, very interested in reading the comments on web stats above, thanks Stephen/Mark/Jason et al.