Reverse Engineering the Nonsense. #marketing #coco #eureka


It looks daft, doesn’t it… it’s either bats**t loonball or someone has just done their job very well indeed…. personally I think it’s marketing genius. This is how I imagine the phone call went…..

[Eureka] – “Hi, is that Coco? It’s marketing dude at Eureka here, we’ve got this idea to sell these vacuum cleaners.”

[Coco] – “Go on, I’m listening….”

[Eureka] – “Will you walk down the high street while one of our other dudes vacuums the street? We’ll give you 10% of the sales”

[Coco] – “Deal! Can I wear what I want? If I’m gonna look mad I might as well do it in style….”

[Eureka] – “Deal!”

(Disclaimer: The above is ALL MADE UP)

Back of the Beermat later….

Facebook views as I took the screenshot: 15,777,263….. nice.

One percent convert to sales? A long shot but hey, it’s madness this morning.

So 157,772 sales at $219 as, let’s be honest, you want the one that Coco gets someone to clean the street with…. $34,552,205.97. Nice.

Coco walks about with $3.4m in her back pocket (assuming the getup has pockets).
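The back-of-the-beermat sums above, sketched in Python (the 1% conversion rate is, as the post says, pure guesswork):

```python
views = 15_777_263      # Facebook views at screenshot time
conversion = 0.01       # the "one percent convert" long shot
price = 219.00          # the vacuum in question
coco_cut = 0.10         # Coco's 10% of sales

sales = views * conversion           # ~157,772 sales
revenue = sales * price              # ~$34.55m
coco_take = revenue * coco_cut       # ~$3.46m for a walk down the street

print(int(sales), round(revenue, 2), round(coco_take, 2))
```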

Not bad for an hour’s work: a bit of mockery on Facebook and YouTube, some odd headlines about you, but hey, the exposure is priceless. Eureka have saved a fortune on YouTube CPM fees and a full marketing campaign.

That doesn’t even take into account the outfit and what the baby is wearing. Now if you could scan the image into an app and find out about it…… Oh Kim’s working on that already….


How to run R scripts from Clojure – #clojure #r #datascience #data #java

An interesting conversation came up during a tea break at a London meeting this week. How do you run R scripts from within Clojure? One suggestion was simple, the other (mine) was far more complicated (see the “More Complicated Ways” section below).

So here’s me busking my way through the simple way.

Run it from the command line

The Clojure Code

Using the clojure.java.shell package gives you access to the Java system command process tools. I’m only interested in running a script, so all I need is the sh command.

(ns rinclojure.example1
 (:use [clojure.java.shell :only [sh]]))

The sh function produces a map with three keys: an exit code (:exit), the output (:out) and any error (:err). I can evaluate the output map and check the exit code – anything non-zero means something went wrong – then dump the error, or if all is well send out the output.

(defn run-command [r-filepath]
 (let [command-output (sh "Rscript" r-filepath)]
   (if (= 0 (:exit command-output))
     (:out command-output)
     (:err command-output))))

The R Code

I’ve kept this simple: I’m only interested in running Rscript and checking the exit code. If all is well then we show the output, otherwise we send out the error.

The now preferred way to run R scripts from the command line is the Rscript command which is bundled with the R software when you download it. If I have R scripts saved then it’s a case of running them through Rscript and evaluating the output.

Here’s my R script.

myvec <- c(1,2,3,2,3,4,5,4,3,4,3,2,1)
mean(myvec)

Not complicated I know, just a vector of numbers and a call to get the average.

Running in the REPL

Remember the error is from the running of the command and not within your R code. If you mess that up then those errors will appear in the :out value.

A quick test in the REPL gives us…..

rinclojure.example1> (def f "/Users/jasonbell/work/projects/rinclojure/resources/r/meantest.R")
rinclojure.example1> (run-command f)
"[1] 2.846154\n"

Easy enough to parse by removing the \n and the [1] prefix which R has generated. We’re not interacting with R, only dumping out its output. After that there’s an amount of string manipulation to do.
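For that single-value case the tidy-up is small. Here’s a sketch of a helper (the function name is my own, not from the post):

```clojure
(defn parse-single-value
  "Strips R's [1] index prefix and surrounding whitespace,
   returning the value as a double."
  [out]
  (-> out
      clojure.string/trim
      (clojure.string/replace #"^\[1\] " "")
      Double/valueOf))

;; (parse-single-value "[1] 2.846154\n") => 2.846154
```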

Expanding to Multiline Output From R

Let’s modify the meantest.R file to give us something multiline.

myvec <- c(1,2,3,2,3,4,5,4,3,4,3,2,1)
mean(myvec)
summary(myvec)

Nothing spectacular I know but it has implications. Let’s run it through our Clojure command function.

rinclojure.example1> (def f "/Users/jasonbell/work/projects/rinclojure/resources/r/meantest.R")
rinclojure.example1> (run-command f )
"[1] 2.846154\n Min. 1st Qu. Median Mean 3rd Qu. Max. \n 1.000 2.000 3.000 2.846 4.000 5.000 \n"

There’s still an amount of tidying up to do though. Assuming I’ve created x to hold the output from Rscript, clojure.string/split will break the output into a vector with one element per line. First, split out the \n’s.

rinclojure.example1> (def foo (clojure.string/split x #"\n"))
rinclojure.example1> foo
["[1] 2.846154" " Min. 1st Qu. Median Mean 3rd Qu. Max. " " 1.000 2.000 3.000 2.846 4.000 5.000 "]

If, for example, I wanted the summary values then I have to do some string manipulation to get them.

rinclojure.example1> (nth foo 2)
" 1.000 2.000 3.000 2.846 4.000 5.000 "

Split again by the space.

rinclojure.example1> (clojure.string/split (nth foo 2) #" +")
["" "1.000" "2.000" "3.000" "2.846" "4.000" "5.000"]

The final step is then to convert the values to numbers, forgetting the first as it’s blank. So I would end up with something like:

rinclojure.example1> (map (fn [v] (Double/valueOf v)) (rest (clojure.string/split (nth foo 2) #" +")))
(1.0 2.0 3.0 2.846 4.0 5.0)

We have no reference to what each number means: min, max, mean and so on. At this point there’s more string manipulation required; you could derive keywords from the header line or just add your own.
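Sticking with our own keywords rather than parsing R’s header line, the labelling could be sketched like this (the key names are my own choice, matching summary()’s default column order):

```clojure
(def summary-keys
  "Our own labels for R's default summary() column order."
  [:min :q1 :median :mean :q3 :max])

(defn summary->map
  "Converts the values line of R's summary() output into a keyed map."
  [values-line]
  (->> (clojure.string/split values-line #" +")
       rest                           ; drop the leading blank from the split
       (map #(Double/valueOf %))
       (zipmap summary-keys)))

;; (summary->map " 1.000 2.000 3.000 2.846 4.000 5.000 ")
;; => {:min 1.0, :q1 2.0, :median 3.0, :mean 2.846, :q3 4.0, :max 5.0}
```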

More Complicated Ways.

Within the R libraries there’s the rJava package. This lets you run Java from R and R from Java. I wrote a chapter on R in my book back in 2014.

It’s not the easiest thing to set up but it’s worth the investment. There is a Clojure project on Github that acts as a wrapper between R and Clojure, clj-jri. Once set up, you run R as an REngine and evaluate the output that way. There’s far more control but it comes at the cost of complexity.

Keeping Things Simple

Personally I think it’s easier to keep things as simple as possible. Use Rscript to run the R code, but it’s worth considering the following points.

  • Keep your R scripts as simple as possible, output to one line where possible.
  • Ensure that all your R packages are installed and working; it’s not ideal to install them during the Clojure runtime as the output will become hard to parse. Also make sure that all the libraries are running on the same instance as your Clojure code.
  • In the long run, have a set of solid string manipulation functions to hand for dealing with the R output. Remember, it’s one big string.


Time Critical Offers 101: Watch @garyvee #smartretail

A short post but an important one. It’s one of the most interesting plays I’ve seen to push a time critical offer. And it’s an interesting one to break down a little bit. So, in the great Gary Vaynerchuk tradition let’s get micro on this a little bit.

Buy My Stuff, In Exchange I’ll Give You My Time

So to push a two hour conference here’s the deal, you buy two cases of wine, selected by Gary, for $479.99. There’s no “buy tickets to this event”, no GetInvited or EventBrite links to buy access (and giving another supplier revenue). It’s a simple buy this and you’ll get what Gary is offering, a place at the conference.

Time critical offers are a mix of components. Get them right and you can measure the success:

  • An item, could be an appointment, a session or a stock item. In this case it’s wine.
  • A payoff: money off, free gift or access to something scarce. Here it’s Gary’s time.
  • A time limit. Here it’s the day of the conference, October 14th. Assume that with the audience size (the image shows it’s been viewed over 206,000 times) the offer will sell out beforehand. Scarcity accelerates demand.
  • A clear outline of the overheads involved, more on this in a minute.

We now have the elements of a formula:

Item retail price * available = total potential incremental revenue

Not a lot to it really…..

$479.99 * 200 = $95,998

Not bad going. A call to action and incremental revenue. Perfect. At a guess there’s a clear 30% profit margin once you take off sales tax and salaries, but there’s no room hire or, I’m assuming, paying Gary to appear for two hours (and the rest). Overhead reduction means profit increase.


The scene is simple really: know your audience, know your stock and know your numbers. The time frame is critical; there are customers who want your product and don’t want to lose out.

Find them by the medium that they consume (Snapchat, Instagram, Facebook and Twitter etc) and deliver the message. If you can personalise it then even better, that takes effort though.

In my opinion Gary executed it perfectly, the results though will be in the point of sale. That’s the measure.

Saving the Stylist Time. Dappad is ripe for machine learning: @dappad_official #dragonsden #toukertime

It’s not often I watch Dragon’s Den and get a little bit excited. Okay, I kind of knew that investment wouldn’t be on the table but the opportunity is. What concerned me was that it’s Erika’s gig; she is the stylist, the brand, and that brings its own problems as growth happens.

Time is the main metric

Throughput of orders and recommendations takes time. The three boxes a year is very similar to Tesco’s “four Christmases a year” concept for Clubcard vouchers.

If you reduce the time, you put more orders through. Doing it on your own is possible, but growth is capped by the number of boxes you can put together in one day.

So if we can find a way to save time we can process more. And there are two key aspects that will make that happen: customer preference data and product attribute data. If you can marry those two then you are on the way to improving the process. I don’t know how Erika is doing it right now; from the pitch it sounded like it was all a manual process. I could be wrong.

Machine Learning Can Help

The main focus here is to get machine learning to automate the selection process for Erika, some form of matchmaking algorithm, the who-gets-what selection that gives a list of preferred items to box.

The final say is with Erika, not the algorithm, and that’s the important part as the customer is still paying for a personal service so there needs to be involvement. Machine learning aids the process but does not take over.

Measure Everything

Peter Jones’ main beef was over returns, which is a reasonable concern. We know what products are going out (from our theoretical system) and we know that some products are going to come back. This becomes a self-learning system: items that worked and items that didn’t are fed back in so the recommendations can improve.

Be certain of one thing, you will never have a perfect prediction but you can feed as much data back into the algorithm to ensure that your error rate starts to reduce. Once you are increasing certainty then you are reducing the chance of returns. That starts to increase the value of the customer and therefore increases the bottom line.

The matter of held inventory was also an issue. Using automated recommendations there’s a process that could, over time, minimise the stock held by Dappad and allow ordering on a just-in-time basis. Automate the recommendation across the user base, order the required quantities from the suppliers and then box appropriately.

Summing Up

There’s nothing here that I have presented that’s out of the ordinary nor anything that would worry me as a customer. It’s just taking a look at the supply chain process and seeing what could be improved with a little automation and algorithmic learning.

The questions in my head right now:

  • If you introduced 4 boxes a year instead of three what’s the impact to turnover?
  • Can you apply Zara’s supply chain learning to Dappad and get down to near zero stock?
  • Would the introduction of some form of artificial intelligence or machine learning reduce the returns by 30%? If so what’s the financial uplift?
  • Can you replicate it for different bands of customer: low spend, mid spend and luxury markets?

Ultimately all five Dragons passed on Dappad and for once in my life I actually think that Touker Suleyman missed a trick here….. no #toukertime this time.


New Look vs Zara – They don’t compare. #smartretail #data

So I had a lot of fun talking loyalty, data and vouchers and generally dissing social media at Smart Retail last week. And while I enjoyed Adoreboard’s presentation I can stay silent no more, there’s one part I don’t agree with and it’s all to do with that slide on fashion retail.

The original blog post is here; it’s worth a read as it’s important context for what I’m about to say. Emotional metrics are fine, I’ve got nothing against them, but they are not to be compared with others in the same space.

So what follows is merely my opinion but with some more numbers to back up my assumptions.

Why Does It All Matter?

I see a lot of these comparison reports. And like a data trail they are left on the internet for all to see. Now then, these findings are open to discussion but they do have impact in some quarters.

Take JP Morgan’s post on Bitcoin being fraudulent. The cynic in me sees it like this: JP Morgan have been investigating blockchain technology, the very technology Bitcoin is built on, for a long time, so why diss it? Perhaps in the knowledge that if you do then the price will drop. Markets are driven by emotions. So after the post the price of Bitcoin dived for a very short period of time; guess who had the highest purchase volume….. JP Morgan. I’ll let you derive your own conclusions from there, I have my own.

Same thing applies here: when you are talking about five fashion brands, valuation matters. And as markets are emotionally driven it can do as much harm as it can highlight a product. Don’t think posts have no ripple effect, they do.

Nothing gives you the fear of responsibility like a complete stranger walking up to you at an international conference and saying, “Hi, I subscribe to your blog”.

One Dimension isn’t enough

Twitter data is dreadful, that’s the plain and simple truth of the matter. I’ve done enough of it over the last seven years to know. I personally can’t value it as a single data source. On top of that there’s the quantity of data to get insight from: the more the merrier. From the article, “We analysed over 6,000 mentions of five of the leading online fashion retailers”. That’s not a lot of tweets, and I wondered what day of the week, what time of day and so on?

As we don’t know the percentage of mentions across those 6,000 tweets we don’t really know the true value of those scores. Were 70% of the mentions of New Look and only 10% of Zara, for example? These kinds of breakdowns need to be reported so we get a balanced view. Was the score weighted according to how many tweets were ranked against each retailer….. I ask a lot of questions.

The simple upshot is that you’ll get results from a small data sample, but I’d like to see something over a million, twenty million or a hundred million tweets. And don’t give me the cost and processing power argument; that’s utility stuff and right now it’s cheap. Many knock Hadoop now but it’s the first thing I’d go for in something like this. And it wouldn’t take long either: I’ve done sentiment studies with 8 million tweets that processed in just over 40 seconds.

So let’s introduce a second measure. There are a few to choose from and I’ll go through each here. In fact I’ve got three extra metrics to go against: the number of Twitter followers each brand has, the number of Facebook page likes and, finally, the number of physical stores.

Reverse Adoreboard against per 1000 Twitter Followers

Firstly, I know what you’re thinking: “what’s a reverse Adoreboard?”. Well, the index score gives the positive emotion index; I want the complainy whiney index version of that…. it needs a nicer name so it’s a Reverse Adoreboard. I’m assuming here the score is based on 0-100, which is interesting in itself as it means the top fashion retailer is still below 50% in customer satisfaction. I digress….. a reverse score is 100 minus the Adoreboard score.

The 6,000 tweets are fine; what we don’t know is the number of followers each brand has. Well, I made a cup of tea and found out. Once we have them we can find out the RA per 1,000 followers. My calculation was easy enough.

Reverse Adoreboard Score / (Followers / 1000)
Retailer    RA   Twitter Followers   RA / T1000
Top Shop    58   1,330,000           0.0436090225
Zara        76   1,270,000           0.0598425196
Asos        64   1,040,000           0.0615384615
Boohoo      70     482,000           0.1452282158
New Look    55     362,000           0.1519337017

Ranking by the per-thousand score from smallest to largest changes the standings quite significantly: once you balance the negative score per thousand Twitter followers for the brand, New Look actually comes out bottom and Zara comes out second.
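The table above is easy to reproduce; here’s a quick sketch in Python using the RA scores and follower counts from the post:

```python
# Reverse Adoreboard (RA) score and Twitter followers, as listed in the post
brands = {
    "Top Shop": (58, 1_330_000),
    "Zara":     (76, 1_270_000),
    "Asos":     (64, 1_040_000),
    "Boohoo":   (70, 482_000),
    "New Look": (55, 362_000),
}

# RA per 1,000 followers: Reverse Adoreboard score / (followers / 1000)
ra_per_k = {name: ra / (followers / 1000)
            for name, (ra, followers) in brands.items()}

# Rank from smallest (best) to largest (worst)
ranking = sorted(ra_per_k, key=ra_per_k.get)
print(ranking)
```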

Reverse Adoreboard against per 1000 Facebook Page Likes

Okay, that was Twitter; let’s look at Facebook while we’ve got some numbers to work with. Using the same Reverse Adoreboard score, how do the retailers stack up on RA per 1,000 Facebook page likes?

Retailer    RA   Facebook Page Likes   RA / FB1000
Zara        76   25,907,851            0.00293347371
Top Shop    58    4,277,568            0.01355910648
Asos        64    5,035,399            0.01271001563
New Look    55    3,426,041            0.01605351483
Boohoo      70    2,534,629            0.02761745407

Fashion retailers get far more attention on Facebook than on Twitter; I think that’s important to point out. The other interesting fact here is that Zara’s presence is 1.69 times bigger than the other four combined. So when you run the RA score against the page likes, Zara just runs ahead of the competition.

I have to be careful here as the RA metric really applies to Twitter users and not Facebook ones. You’d have to run the study again on Facebook customer experience data to get a better idea, but something tells me that Zara would still come out on top. That’s a gut feeling, though, and not to be trusted. You need the data.

Reverse Adoreboard against per 1000 Physical Stores

Asos and Boohoo don’t get counted here as they don’t have physical stores but are purely online.

Retailer    RA   Physical Stores   RA / PS1000
Zara        76   7,000             10.85
New Look    55   1,160             47.41
Top Shop    58     500             116.00

This is really just a guide; online and offline customer experiences are different beasts. A better gauge would be refunds from point of sale: there’s a good chance that complaints aren’t actually recorded and the reaction is merely dealt with.

Zara’s raw RA score is the highest, but spread across the number of stores it has it comes out best. I’d expect to see that. Even if there was a 10-15% tolerance in the scores Zara still comes out on top. As Zara’s core business is not online but in store, it should come as no surprise.


From the day I read the Adoreboard blog post I’ve never agreed with the results. What I have presented here, while not perfect, is an alternative view with extra data points. It’s only when you introduce a second metric that you can drill down into the results and get better insight.

Each brand performs well in its own way. You’d expect Asos and Boohoo to nail the online space as it’s their core business, but they do a good job of staying middle of the road in terms of performance. For my money both Zara and Top Shop are doing a better job than New Look in terms of balanced ranking on Twitter and Facebook.

The Adoreboard index is fine, it ranks emotion, but it’s only a single view in my opinion. Which is fine when one brand rolls up and wants to see the emotional responses for their own brand. Once you bring in competitors the results are very open to interpretation. As a blog post it is good. As a system it’s good; please don’t take this as me knocking Adoreboard because I’m not. I’m exploring the meaning of the original blog post that I disagreed with. As there’s context missing it’s always going to be a matter of opinion whether things are right or wrong.

Best course of action: run the whole analysis again with a million tweets.


The original Adoreboard post:

Google Sheet of the raw data and calculations:

Speaking at Smart Retail – #retail #customerloyalty #data #coupons #vouchers

The SmartRetail Conference is taking place on Thursday 28th in the Culloden, Belfast. I was asked to talk about customer loyalty and my experiences with loyalty based discounting, something I covered with uVoucher.

So, day off booked. Slides are done.


And yes, I will talk Tesco Clubcard and the supply chain wonders of the Zara fashion chain. It’s all about the data.

This is a great opportunity for anyone in retail to network and learn some new things. You can pick up tickets on the SmartRetail website.



Loyalty as currency: #blockchain meets @taylorswift13 meets @BurgerKing #loyalty

Loyalty is really about customer control. It’s about crafting, controlling and defining the conversation. The most control over loyalty a customer actually has is to leave and go to another brand.

Loyalty plus currency is about controlling the customer conversation: not just how the customer can interact with the brand, but also how the customer can buy from you. That’s where the power is.

Taylor Swift and Ticketmaster

In the last few days Taylor Swift has come under fire from some quarters for rewarding loyal fans with access to concert tickets. Plain and simple: put the tickets into the hands of fans, not the ticket touts or bots.

The first stage of loyalty is measuring feedback from the customer; rewards are based on interaction with the brand. Without a way of measuring the conversation you can’t reward it. Whether that’s via a loyalty card at the point of sale, social media or coupons doesn’t really matter; what matters is that it’s traceable and measurable.

I think the criticism from the media and other bands is too harsh, I see where the angle is in all of this. If customers get annoyed with the brand (Taylor Swift) when it’s outside of the brand’s control, in this case touts and bots, then it harms the brand and causes long term value damage between the customer and the brand.

The partnership with Ticketmaster changes that: the vendor does the monitoring and it’s a case of tying the customer to the loyalty measure (social media) as a determining factor on whether that customer is loyal or not. If the scoring is good then you can book a ticket; if not, you move down the priority queue. Basically, if you can show loyalty to the brand, the brand will be loyal back.

It’s not perfect but it’s an improvement.

There is nothing new here. It’s just that it’s Taylor Swift so it’ll come under the media microscope. Personally, I think it’s a perfect move under the circumstances. Any artist wants to perform to true fans of their work, not those merely entitled by the size of their wallets.

The Tout Problem

In days gone by counterfeit concert tickets were easily got. I never knew they existed until I went to my first proper gig in 1986, Level 42 at the Manchester Apollo (Mike Lindup lost his voice that night but it left its mark: I’m still a musician thirty years later).

So, these touts were hanging around the front doors selling dodgy tickets and no one batted an eyelid. Some chanced it…. I had my ticket and I wasn’t letting go of it.

Touts now operate in the internet age by buying up loads of tickets and selling them at crazy markup. Bots make the whole process worse by purchasing at far faster speeds than any human can. The fans lose out, the artist isn’t happy.

The issue is that the currency is the same whether you’re a tout or the most devoted fan of an artist. It’s sterling, or the US dollar, or whatever you’re paying in. Brands don’t control the global currency markets.

Solving the Artist & Customer Purchase Issue

Firstly you need to control the ticket creation, verification and authenticity of the tickets for a concert.

Dare I say it, I think blockchain might be the answer. The ledger would act as a historic and signed list of the tickets generated. On concert day, when you get your ticket scanned, the confirmation system would look up confirmed blockchain keys on the ledger; if you’ve got a confirmed key then all’s good. If it’s a bad match and there’s nothing in the chain, well, it certainly wasn’t generated in this ticket run for this gig; back to the car park with you.

If you keep the blockchain ledger locally (ie within the venue on concert day) then you’re reducing the round trip from scan to confirmation. I’ve worked with mobile ticket-scanning confirmation systems that work over the air to internet servers; they’re slow and connections break frequently. Local servers reduce complexity and increase speed.
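As a toy illustration of the door-scan idea (this is just a hash lookup against a locally held set of issued tickets, not a full blockchain; all the ticket ids and the gig name are made up):

```python
import hashlib

def ticket_hash(ticket_id: str, gig: str) -> str:
    """Sign a ticket by hashing its id together with the gig it was issued for."""
    return hashlib.sha256(f"{gig}:{ticket_id}".encode()).hexdigest()

# Built at ticket-creation time and copied to the venue before doors open
issued = {ticket_hash(t, "gig-2017-10-14") for t in ["T0001", "T0002", "T0003"]}

def admit(ticket_id: str, gig: str) -> bool:
    """Door scan: valid only if the hash exists in the local ledger copy."""
    return ticket_hash(ticket_id, gig) in issued

print(admit("T0002", "gig-2017-10-14"))  # confirmed key, all's good
print(admit("T9999", "gig-2017-10-14"))  # not in this ticket run, car park
```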

So where do Taylor and a Whopper collide?

If you can control the loyalty and who gets what, whether it’s a supermarket like Walmart or Tesco or a global artist like Taylor Swift, that’s one thing. To control what the customer buys with is something else.

Interestingly, Burger King in Russia are tying up customer loyalty from the opposite side: the currency. They’re trialling a WhopperCoin, a Bitcoin/Ethereum-like token. These tokens can be bought, sold and traded like bitcoins, exchanged between the brand (Burger King) and any of BK’s partners and, even better, customers can sell currency between themselves. It has value.

Loyalty card points have value but it’s usually fractional and in some places come under banking and finance rules. If you’re running your own loyalty scheme with points it might be worth checking…. you’re effectively introducing liquidity into a market.

So, WhopperCoin for currency with limited use to loyal customers. And a ledger based control system for ticket/transaction authenticity.

Taylor Swift merges with Burger King, kind of….

Not literally, but the concepts could merge. What if we were to say that Taylor Swift fans can earn fractions of TaylorCoins for social media support, blog posts, full YouTube views of videos and so on. These coins can be sent, received and traded between fans and also used to buy blockchain-enabled concert tickets.

At this point, as I see it, if a tout or a bot wants to purchase Taylor’s tickets for a show they have to pay in TaylorCoins, and anyone converting huge amounts of dollars into TaylorCoins would set off alarms in the system. When the brand has control of anomaly detection at this scale it can act, by either declining the transaction or other means.

The tout at this point will stick out like a big red flag in a very strong wind. When you control the currency, you control the brand. Touts can be turned away, and early in the process. The only way left to get tickets would be to create fake ones, and as the genuine ones are in the blockchain they are easily authenticated against all the other tickets in the ledger.

Ticketing is Big Money

Tours usually break even, unless you’re the Rolling Stones. And that’s why the artist/tout relationship has never been good and it never will be. One side is gaining huge volumes of money over the artist’s reputation and brand.

I’m skimming the surface of a bigger idea here, I think, but on paper the artist controlling the ticket ledger with blockchain, and also controlling the currency the customer uses to interact with the artist, provides two key steps towards reducing fraud, counterfeit goods and real fans being locked out from their idols. Win, win, win and win all around.

Give it five to fifteen years….. everyone will be going to concerts this way. Perhaps.



The Gig Economy Part 2 – No, I’m not doing a drinks delivery startup….

My blog post yesterday outlined my thought process for an alcohol delivery service. It’s obviously hit a nerve with the readership (thank you, both of you).

Yesterday I received quite a few messages from folk suggesting I should do such a thing: “That’s a great idea Jase, you should do it!”, “We had this very problem over the weekend!”, “Your delivery charge is too low, I’d pay £3 to get it to my door!” and other communications ending with exclamation marks.

I’m not doing it. There are plenty out there doing such a thing, I named one yesterday, Hungry House, if you’re in the UK. If you want to do it, you have my blessing, use the original post as a starting point.

Winner on branding though…. Saucey, just make it as chic as you can. Bawse as a Service.

The Gig Economy, doing initial calculations as a #startup – #gigeconomy #uber #deliveroo #justeat

I rarely have dreams, but I do get words in my head when I wake up. I don’t know why I woke up with the thought “Alcohol delivered to your door”. Everything is great at work, my life is good and I’m not on the lookout for a bottle of anything. It was an idea that just entered my head, and when that happens I note it down and have a look at it.

So, let’s tear it to bits, Buckfast on Demand.

The Idea

Connecting off sales to customers. That’s all it’s about. I could call it the Uber of Booze or the Just Eat of Binge Drinking* or even the Hungry House of Liver Damage, but right now it’s just about getting something from a retailer to a consumer. It’s just delivery.

So, “as a customer I want to order a couple of bottles of Jacob’s Creek CabSav and have it delivered within the hour”.

* Interestingly, Just Eat dropped their alcohol delivery service. And if I was an over-excited entrepreneur then I’d be asking one simple question: “WHY?!”

It’s All About the Numbers

Once you’ve figured out some key numbers then you can look at whether to proceed. I need a calculation to figure out how many drivers I’ll need to service an area of n population.

Where I live has a population of about 12,000 people. Off the top of my head I can think of four off sales sites that can supply. Notice I’m keeping away from the how-do-we-keep-the-stock-items-up-to-date-on-the-app argument; that’s only a discussion once I’m past this point: is it really worth doing?

I’m estimating that 2% of the local population will use the service twice a week. The peak will be two days of the week, Friday and Saturday over a six hour period 5pm-11pm. Let’s assume that 80% of the order volume will be on those two days. I’m trying to find the number of delivery drivers I’ll need to service the local area.

12,000 x 2% = 240 customers
240 x 2 orders = 480 orders a week
480 x 80% = 384 orders over the two peak days
384 / 12 peak hours = 32 orders per hour at peak time

With an average trip time of 20 minutes, each driver can do three deliveries an hour, so to fulfil 32 orders within the hour I need around 10 drivers in theory. Now that doesn’t account for multiple orders originating from the same retailer. Pick-up time is reduced, so in theory I could save 20% of the time. Also order patterns are never uniform; they may bunch up right after work, between 5:30pm-6:30pm, or just before closing at 10:45pm.

I’m going to settle on 8 delivery drivers.
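Those numbers as a quick Python sketch (the 20% saving from batched pickups is the same guess as above):

```python
population = 12_000
weekly_users = population * 2 // 100         # 2% of the town: 240 customers
weekly_orders = weekly_users * 2             # two orders each: 480 a week
peak_orders = weekly_orders * 8 // 10        # 80% on Friday/Saturday: 384
peak_hours = 2 * 6                           # two 6-hour evenings, 5pm-11pm
orders_per_hour = peak_orders // peak_hours  # 32 an hour at peak

trips_per_driver_hour = 60 / 20              # 20-minute trips: 3 an hour
drivers_needed = orders_per_hour / trips_per_driver_hour  # ~10.7 in theory
drivers_after_batching = drivers_needed * 0.80            # ~8.5, settle on 8
```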

To Employ or Self Employ that is the question?

This is where things get sticky, especially if you’re a Guardian reader. You want to pay the workers fairly but that comes at a cost. Right now my mini operation covers one town. Settling on a minimum wage rate to start off with, let’s have a look at the numbers.

First assumption: the workers are over 25, so the minimum wage is £7.50 an hour. It’s going to be a part time gig as my peak times only cover the two days. These bits would need work and refining. Second assumption: employment with this company comes with the expectation that this is a second job.

The calculation I’m using is minimum wage multiplied by the number of contract hours, multiplied by number of drivers (8) and doubled for employer contribution costs and so on.

£7.50 x 16 hours = £120 per driver a week
£120 x 8 drivers x 2 (employer costs) = £1,920 a week

Now that’s revenue my service needs to earn just to pay the delivery workers, that doesn’t take into account my costs, marketing, hosting, up front development costs, payment gateway percentages and so on.

Can We Get the Revenue to Balance Up?

We’ve already calculated that 480 orders a week is a working average based on 2% of the population. Assuming that the order total is £16.90 + £1 local delivery, that gives a per order total of £17.90 giving us a total retailer revenue value of £8,592 a week. We, though, are nowhere near out of the woods yet.

I’ll be taking a percentage off the retailer; that’s my fee for handling this whole pickup/delivery operation. Ballpark figure: 7%. Once again this could change, especially if it wipes out the profit margin on the retailer’s side.

£8,592 x 7% = £601.44

So on paper the 8 delivery driver plan is out of the window. Now I could be crafty and only use 18-20 year old drivers, reducing the minimum wage from £7.50/hour to £5.60/hour.

5.60 x 16 = £89.60 per driver; x 8 drivers x 2 = £1,433.60 a week

A decent reduction but still nowhere near what I’d need for a break even. And the thought of “if we raise enough runway” hasn’t entered my head yet. I could have all the runway I wanted and still burn through it at a rate of knots.
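Putting the 7% fee revenue against both wage bills in one place makes the gap obvious (all the numbers are the assumptions above):

```python
# Fee revenue vs wage bill for the two minimum-wage scenarios.
weekly_orders = 480
avg_order = 16.90 + 1.00      # basket plus £1 local delivery
fee_rate = 0.07

fee_revenue = weekly_orders * avg_order * fee_rate   # my weekly cut

for hourly_rate in (7.50, 5.60):                     # over-25 vs 18-20 rate
    wage_bill = hourly_rate * 16 * 8 * 2
    shortfall = wage_bill - fee_revenue
    print(f"£{hourly_rate:.2f}/hr: fees £{fee_revenue:.2f} vs "
          f"wages £{wage_bill:.2f}, short by £{shortfall:.2f}")
```

Even on the cheaper rate the fees cover less than half the wage bill.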

Now for the Flipside

Now then, that’s if I was employing them. If they were self employed then I can avoid all that and pay per trip. So we need to look at the average sale again.

I said £17.90 for my two bottles of CabSav plus delivery in the local area. Once I take the mythical 7% I’ve got £1.25 revenue per sale. I agree to pay the driver 65% of that, or 81p. Now it doesn’t look worth it at all, but the more deliveries you do the more you make, and it becomes a sport to fulfil as many orders as you can. If you’re really smart you could make over £300 in two days just on the peak orders.

For me as the entrepreneur, the self-employed driver model just wins hands down. They’re making some money and I’m making some money (roughly 44p an order). As a model, though, it’s still wide of the mark.
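The per-order split can be wrapped in a small helper. The 7% fee and 65% driver share are the figures assumed above, and `order_split` is just an illustrative name:

```python
# Per-order economics under the self-employed driver model.
def order_split(basket, delivery=1.00, fee_rate=0.07, driver_share=0.65):
    """Return (my fee, driver pay, my margin) per order, in pounds."""
    fee = (basket + delivery) * fee_rate
    driver_pay = fee * driver_share
    return round(fee, 2), round(driver_pay, 2), round(fee - driver_pay, 2)

# Two bottles of CabSav at £16.90 plus £1 delivery.
print(order_split(16.90))  # (1.25, 0.81, 0.44)
```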

The model is too basic

Right now everything is based on averages. These are baseline assumptions; now I need to look at the variations.

Seasonal – Christmas, New Year, Wakes, Births, First Communions, Confirmations and right down to “it’s wine o’clock” – you’d expect peaks of order volume and you’d have to adjust manpower to suit.

Event Driven – Any sporting event…. get the beers in. Done.

Order Type – So far I’ve worked the model off one drink, wine, and that’s a low retail order volume. What if a customer orders six bottles of vodka instead? £12 a bottle makes it a £72 + £1 order (£5.11 revenue, with £3.32 to the driver and £1.79 to the startup). You may want to start an Ambulance as a Service at the same time if customers are ordering six bottles of vodka regularly.
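The vodka order worked through the same way (same assumed 7% fee and 65% driver share as before):

```python
# Six bottles of vodka at £12 each, plus £1 local delivery.
order_total = 6 * 12.00 + 1.00       # £73.00
fee = order_total * 0.07             # my 7% cut
driver_pay = fee * 0.65
my_margin = fee - driver_pay
print(round(fee, 2), round(driver_pay, 2), round(my_margin, 2))
# 5.11 3.32 1.79
```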

Ultimately I’m curious about the baseline 2% figure which I outlined at the start. What if it were 3% of the population of 12,000 ordering?

12,000 x 3% = 360 customers; 360 x 2 = 720 orders a week

Your order volume increases 50% to 720 orders a week, the theoretical retailer order book goes from £8,592 to £12,888, and your 7% becomes £902.16 for the week.
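The uptake sensitivity as a quick loop; the 1% row is included only to show the downside case, the 2% and 3% rows are the figures worked through above:

```python
# How the weekly fee revenue moves with the uptake assumption.
population = 12_000
avg_order = 16.90 + 1.00   # basket plus £1 local delivery
fee_rate = 0.07

for uptake in (0.01, 0.02, 0.03):
    orders = population * uptake * 2    # two orders per customer per week
    revenue = orders * avg_order * fee_rate
    print(f"{uptake:.0%}: {orders:.0f} orders/week, £{revenue:.2f} in fees")
```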

Scaling Up

To make it you need to operate in as many cities as possible, hundreds of them, with as large populations as possible. Rolling out in small areas is usually not representative of how the model will behave. Where I live is certainly not the target market, I think; even the next major city wouldn’t have the volume.

So, even though I had no notion of starting such a business, I’d pass on it. Too variable for my risk aversion. It’s been an interesting exercise on the numbers, though. And I’m not saying this is right; it’s very open to interpretation and would need another four or five iterations and a kicking about before I’d think it worth pursuing.

Further Reading

If you’re interested in the modelling aspects which I’ve lightly scraped at above then it’s worth looking at John Adam’s book, “X and the City”, which models various city scenarios, some sensible and some not. Here’s an Amazon link to it.

If you’re really interested in city modelling then check out Witan, which is Mastodon C’s city modelling platform. (Not a paid endorsement, by the way.)

Tell Me, What Did I Miss?

So, I’m not a genius and I’m not a maths whizz. This is a mix of simple numbers, common sense and a calculator. If you think there’s anything I’m way off the mark with then I’m all ears; feel free to leave me a comment. I’m here to learn from you just as much as you’re reading this to learn from me.

If we can learn from each other then perhaps we can improve things all round for the better.

….So how will I get my beer?

I’ll drive to the shop, it’s just down the road.





The Summer Reading List – feat: @tarah @holdenkarau @HarvardBiz @mattwridley

With two weeks in August I’ve learned some new things. The exchange rate will remain against us but it doesn’t change our resolve, we’ll jump on aeroplanes and go to Spain, we just don’t return home with the donkey and the sombrero now. And yes, Ryanair purposely do hard landings to save on tyre wear and shave turnaround times.

The UK could learn a thing or two on how to charge for public transport. Buses and trams are cheap and people use them. Alicante town and Altea are lovely.

Benidorm is what you make it, it’s not all the mad drinking that the UK media play out. During the high season mobility scooters are a lot less common, in October it’s mobility gridlock.

Finally, Belfast International’s international arrivals could do with a lick of paint, and immigration could be kept on the same level (i.e. the ground floor). Just saying, it’s depressingly grey to come back to. Heck knows what out-of-country visitors make of it.

Aside from all that, it’s a good time for me to catch up on reading as I don’t normally get a huge amount of time. So here’s what was on the bookshelf, in the carry-on bag and in my shoulder bag. Never be without a book…..

The Evolution of Everything: How Small Changes Transform Our World (Matt Ridley)

A surprise find in a small newspaper/bookshop in Benidorm. The book is broken up into the different areas of science, philosophy, business, technology, economics and so on. And it’s a great read, with plenty of new things to learn that I wasn’t aware of. It’s not a technology book but there are some very interesting points to take from it.

Around 23 people came up with the idea of the lightbulb during the same period as Edison did. So how does a company/person claim more patents on “inventing” something when the idea is usually shared?

Find it on Amazon UK

HBR 10 Must Reads 2017 (Various)

I only ever find HBR books in airports. In reality I only bought it for one article, about the ownership and curation of Artificial Intelligence models, but the other articles are great too.

Find it on Amazon UK

A Truck Full of Money (Tracy Kidder)

The story of Paul English, who was one of the founders of Kayak. It’s a read about English, not about Kayak, though that does feature in and out of the book. It’s a good grounding in his thought process, which can be all over the shop (so not just me then). Sometimes the writing tends to go on a bit; I think it could have been shorter.

Find it on Amazon UK

Women In Tech (Tarah Wheeler)

I bought this for my not-so-wee-one but it’s taken up permanent residence on the living room table for everyone to read. While Tarah has written and curated a brilliant book on women in tech, the information is really a must-read for anyone wanting to be in tech. Like I said in a previous post, I wish I’d had this book thirty years ago.

Find it on Amazon UK

High Performance Spark (Holden Karau and Rachel Warren)

I’m blessed: I get to do some interesting Spark work at Mastodon C, but finding good reading material on the subject can be hard. The general rule of thumb is that if Holden has been involved then I read it.

The book is about getting the most out of Spark, from Spark SQL and ML to getting the best performance out of RDDs. The code is in Scala as you’d expect, but that shouldn’t be a worry if you use Python, Clojure or Java. You’ll figure it out; that’s what you’re paid to do.

Find it on Amazon UK