Blockchain Made Easy

Blockchain – the latest voodoo to be cast in the IT realm, with many not even understanding what it is and most not understanding the problem it tries to address.

Put as simply as possible: a blockchain is a public database of private information shared between peers, with a record of transactions, an audit trail and authentication of users.

Can it be made easy to understand?

Well, let’s try, starting with … What was the original problem?

Essentially, the issue was how a currency could be created without a bank. This creates a number of requirements … Basically: there must be a common understanding of its value, there must be a common ledger and record of transactions, and there must be mechanisms for people to spend. For transactions to be recorded without a central server, it would need a peer-to-peer infrastructure.

Okay, so – in order to transfer money from one person to another, we need a record of how much money they have.

Great – let’s give users a “wallet” that is available from anywhere. We then need to be able to send messages to transfer currency from one wallet to another.

Okay, now we’re cooking with gas! – but how do I know that the payment really came from User A? Shouldn’t we have some kind of signature?

Now that we have a private key in the wallet, the user can sign transactions, and a signed payment can’t possibly have come from anyone else. Though, we still need to put better security around the payment, don’t we?
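As a sketch of how that signature works – using deliberately tiny textbook RSA numbers for readability, where real wallets use 256-bit elliptic-curve keys – the principle looks like this:

```python
import hashlib

# Toy RSA keypair. The primes are textbook-sized for illustration only;
# real wallets use elliptic-curve keys, but the sign/verify idea is the same.
p, q = 61, 53
n = p * q                      # public modulus
phi = (p - 1) * (q - 1)
e = 17                         # public exponent (shared with everyone)
d = pow(e, -1, phi)            # private exponent (stays in the wallet)

def sign(message: str) -> int:
    # Hash the message, then transform the digest with the private key.
    digest = int.from_bytes(hashlib.sha256(message.encode()).digest(), "big") % n
    return pow(digest, d, n)

def verify(message: str, signature: int) -> bool:
    # Anyone holding the public key (e, n) can check the signature.
    digest = int.from_bytes(hashlib.sha256(message.encode()).digest(), "big") % n
    return pow(signature, e, n) == digest

tx = "User A pays User B 5 coins"
sig = sign(tx)
print(verify(tx, sig))               # True -- genuine signature
print(verify(tx, (sig + 1) % n))     # False -- forged signature rejected
```

Only the wallet holding the private exponent can produce a signature that the public key accepts – which is exactly the “can’t possibly be from anyone else” property.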

So far, so good. But while we do this, we need to ensure that the payload is unique to prevent re-spending the same currency.

Also, we have already said that this will not be stored on a server, so we need to know who has been involved in handing this package from one end of the internet to the other. In fact, instead of handing over just one transaction, surely we should gather several transactions together into a block? While we are at it, let’s make sure the wallets are accessible on lots of nodes around the internet; that way, we don’t need to worry about (1) whether the user can get to their wallet, i.e. high availability, or (2) whether the wallet is vulnerable anywhere.

Ah, now that’s a bit better. However, this block will need to be passed between several peers in order to tell User B about User A’s payment. There are risks around malicious as well as accidental change on the way through that have not really been addressed.

Okay, so imagine this block is going to be replicated several times. We already have the encrypted and signed payloads and the block has a header so that you know what is detailed in the block as well as that it is unique.

In practice, what happens is that validators each check the validity of the block and, when they have confirmed it, tell everyone else that it is valid. This means that how valid you consider the block to be depends on where in the network you view it from, but over time, this improves …

This is where it gets quite technical. We use an algorithm to determine the validity of the block, and what are called “miners” to calculate that validity. We also pay the miners in order to make a fair system that rewards the costly computational effort.
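A minimal sketch of that costly effort, assuming a Bitcoin-style proof-of-work (the header string and the difficulty value here are invented for illustration):

```python
import hashlib

def mine(block_header: str, difficulty: int = 4) -> int:
    """Brute-force a nonce whose hash starts with `difficulty` zero hex digits.

    Finding the nonce is the miners' costly computational effort;
    checking it afterwards takes a single hash.
    """
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_header}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

header = "prev_hash=abc123;txs=A->B:5"        # hypothetical block header
nonce = mine(header)
check = hashlib.sha256(f"{header}{nonce}".encode()).hexdigest()
print(check[:4])   # 0000 -- trivial to verify, expensive to find
```

The asymmetry is the point: any node can validate the solved block with one hash, but producing it took many thousands of attempts.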

So now we understand why we’re doing things in this way, let’s have a look at the process of what we’re doing:

  1. Start with a client, a wallet that contains keypairs, and some unspent currency
  2. You create a new transaction spending some of your unspent currency. Sign it with your private key. Your client will store a copy of it
  3. Your client starts to broadcast the new transaction through the Network
  4. Every client that receives your transaction checks whether the signature is okay, whether there are any errors, and whether you are trying to perform a double-spend. If your transaction fails any of the criteria, it is ignored by the client entirely
  5. All the clients that know about your transaction follow a similar route of broadcasting as you did.
  6. Eventually your transaction reaches some mining pools and the recipients of the transactions. The latter will see the new transaction in their wallets and store a copy of it indefinitely, but it will appear as 0 confirmations. The mining pools will see it as a new transaction and will include it in every block they try to create. They will store a local copy of the temporary blocks and give out the corresponding work to solve to their miners.
  7. The miners don’t know anything about your transaction. Their job is to crunch numbers, not to check for block validity, as that’s a task for the pool.
  8. Eventually your transaction is included in a block that gets solved. It is broadcast proudly through the network and everyone keeps a note of it from now on, to know if some new transaction conflicts with it in a double-spend attempt. Now your transaction has 1 confirmation.
  9. The block creation process continues, and as more and more blocks build on the block your transaction is included in, it gains more confirmations. On reaching 6 or more confirmations, it is considered fully confirmed.
  10. The transaction finishes its life cycle once it is spent by another transaction, meaning that its outputs can be forgotten from the “unspent” memory and disregarded for any other attempts to spend them. It will, however, remain in the blockchain for as long as people will keep track of the full chain.
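The unspent-output bookkeeping in steps 4 and 10 can be sketched in a few lines (the output IDs below are invented for the sketch; real chains identify outputs by transaction hash and index):

```python
# Each entry maps an output id to its amount of unspent currency.
unspent = {"txA:0": 5, "txB:1": 3}

def apply_transaction(spends: str, creates: dict) -> bool:
    """Reject double-spends; otherwise move value into new outputs."""
    if spends not in unspent:
        return False               # step 4: already spent, ignore entirely
    del unspent[spends]            # step 10: forget the spent output
    unspent.update(creates)        # the new outputs become spendable
    return True

print(apply_transaction("txA:0", {"txC:0": 5}))  # True  -- first spend succeeds
print(apply_transaction("txA:0", {"txD:0": 5}))  # False -- double-spend rejected
```

Because every client keeps this “unspent” memory, a second attempt to spend the same output fails everywhere it is checked.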

Interestingly, this is not just a method of publicly maintaining private transactions; it can apply to any information, such as legal agreements or contracts.

However, before you rush out and start building your own Bitcoin or blockchain project, there are considerations about race conditions within the transaction network. It is worth learning the full story first, but this is a basic overview that demonstrates the elegance of the blockchain as a solution, as well as how wildly misunderstood it can become.

But it doesn’t stop there!

In addition to passing data, currency and contracts, you can also have self-executing code – which transforms this into quite an incredible and complex organic mechanism that is continuously validated and secure. This changes blockchain into an online computer that can be accessed from anywhere in the world. The fundamental concepts underlying blockchain will enable a revolution in computing. If harnessed through the right networks, with the right code, it could enable a revolution in how we perceive public and private data. If properly aligned to business processes and governance, it could enable the running of a de-centralised insurance company or bank, or effectively replace any type of business or institution that usually acts as an intermediary, without a human having to lift a finger.


“A blockchain is a magic computer that anyone can upload programs to and leave the programs to self-execute, where the current and all previous states of every program are always publicly visible, and which carries a very strong crypto economically secured guarantee that programs running on the chain will continue to execute in exactly the way that the blockchain protocol specifies.” — Vitalik Buterin

Are you developing Future-Tech?

What is Future-Tech?

When you look at technology available in the marketplace, it comes down to IT companies delivering a solution without even asking questions. The hope is that their sales teams will be able to match your requirements closely enough to what they offer in order to introduce their product.

Generally, the product will be delivered below the cost of developing from scratch, but then you will have to pay over the odds for their specialists to tailor it to what you really wanted in the first place.

This can be cost-effective, but really, any CxO or director should be asking themselves whether it is always cheaper to go down this road.

Future technology on the other hand looks at where your business is going. As the name suggests, it is looking at your future and aligning the right technology to suit.

Typical technology systems that Future-Tech looks at are AI, data analytics, voice and speech recognition, optical character recognition, automatic product checkout, and general automation and integration solutions, as well as biometric identification and advanced security solutions.

How do you use Future-Tech?

First you need to start with your wishlist – the one you put in a drawer, because you never thought it would be useful, but you wrote it because it’s nice to dream. Look at how you want to change your customer experience, your security, your tools or any aspect of your business.

Think about where your company should be, because the rest of the world is moving forward at breakneck speed, and with tech companies like Google and Amazon transitioning into different markets, it’s only a matter of time before your customers could be taken by a company that is bold enough to use cutting-edge technology.

Essentially it comes down to innovate or lose out. However, it doesn’t need to be as scary or costly as starting from scratch.

You can choose a variety of options, from importing data from your legacy solutions, integrating with legacy services or even load balancing between solutions – any of which can avoid the risks of a big bang implementation.

How do you know which Future-Tech you need?

Simple: drive your innovations by analysing your current profit and loss as well as your business processes. Identify any current pain points and consider how you can make the purchase process as convenient as possible for your customers. Possibly also consider re-purposing your staff to improve the customer experience, rather than leaving them stuck behind tills.

Next, you need to engage with the right technology delivery company – one that can talk to hardware development companies and deliver software that can complement the solution as well as work effectively with your staff.

Consider companies that will get the right information from your teams as well as train them on the solution once it has been developed and tested. There should also be continuous demonstration through the delivery cycle to ensure the solution you get is exactly what you want.

Avoid off-the-shelf solution providers unless they really do offer what you need, and make sure you work with a company that will make anything possible for you. Ensure your technologists of choice understand how to stick to your budgets, and break up the delivery into milestones so that they have to keep proving themselves throughout.

Where can Future-Tech get you?

A great example of where Future-Tech is revolutionising the shopping experience is the automatic checkout solution being trialled right now. Essentially, the customer is identified by their mobile device, the store’s computers recognise when items are removed from the shelves by sensing the change in weight, and the customer is automatically billed as they are detected leaving the store.

Of course, it changes the nature of the assistance that customers would need as well as the training that staff get, but could even be augmented with automated help systems to reduce the expense of first line support.

What next?

Contact us for a chat about what you want to achieve and we can advise on how you can get what you want. Even if it’s not with us, we can tell you where issues may arise and act as a valuable advisory and senior engagement point to ensure understanding between business requirements and technical solutions.


Death By Smart Meters?

There has been a trend of very anti-smart-meter posts being pushed into the public eye; however, they show an equally worrying lack of proper research and knowledge.

Firstly, a lot of the articles state “facts” that simply are not true. Be careful before you buy into the validity of an article and be sure to do your research. I know, you don’t have the time, but in that case, don’t form an opinion at all.

The smart revolution is about more than just meters and bills. The benefits are hard for everyone to grasp, but it is the collected data – once it demonstrates how useful it can be to energy companies – that will lower your monthly bills.

Incidentally, the information is still owned by the consumer in the UK, unlike in other countries. So for all the foil hat brigade who don’t want to be seen on the grid, your requirements are covered! You have to elect to send data more frequently than meter readings are currently taken.

With the new meters, you will also be able to pay for your energy up front, rather than suddenly being left broke when the quarterly bill comes through the letter box and nearly flattens the family pet!

Smart Meters will also open up the market to competition. Experts in the utilities sector have seen more small energy companies applying to join the DCC network and help offer different products that would not otherwise have been possible.

When naysayers spout statistics about cost versus benefit, I have noticed that the figures are generally not even in the realm of reality, and that they fundamentally misunderstand the technology. One article was even misguided enough to believe that a smartphone could be used instead of a smart meter.

The long and short is that the smart meter network is very complex and has to be insanely secure. Years ago, one of the main IT consultancies for *nix servers was advising customers that it could not guarantee security if they allowed physical access to their servers. Yet now, we are proposing to deploy devices into the homes of people who are motivated to tamper with the data being transmitted from them. How naive would you need to be to think that epic task could be replaced with a few weeks of coding?

This programme doesn’t stop there; it is the precursor to a number of advances that depend on the data networks being deployed for 2020. This is the biggest private network in the UK, and the analysis of the combined data received from millions of homes and businesses will provide a level of insight previously not possible. Imagine if the utility companies find trends that could reduce expense and usage, such as deploying devices to disable heavy-load devices during peak consumption times?

When lots of people turn on a device at the same time, imagine how much energy is lost over the power cables due to heat and load – a single 1 kW heater in one home is nothing, but imagine a hundred homes in one area – that’s 100 kW over one line! Then imagine where each of those lines goes … Wherever the network is able to spread the load, less power will be lost, and therefore the cost to the consumer will be lower.
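To put rough numbers on that intuition: resistive loss in a cable is current squared times resistance, so it grows with the square of the load. The supply voltage and line resistance below are purely illustrative assumptions, not real grid values:

```python
# Resistive line loss is I^2 * R: it grows with the *square* of the current.
# 230 V and 0.5 ohm are illustrative figures only.
VOLTS, LINE_R = 230.0, 0.5

def line_loss(total_load_watts: float) -> float:
    current = total_load_watts / VOLTS     # amps drawn over the line
    return current ** 2 * LINE_R           # watts lost as heat

# 100 homes running 1 kW heaters at once, vs. the same load in two halves:
together = line_loss(100_000)
staggered = 2 * line_loss(50_000)
print(together > staggered)   # True -- spreading the load halves the loss
```

Because loss scales with the square of the current, splitting one peak into two staggered halves cuts the heat loss in half for the same total energy delivered.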

This may be a bit of a simplification, but I urge people to do their own homework and consider where the benefits lie in the future. Don’t just shudder at the one-off cost; think about how the meters will feed information back to the consumer. For example, one test home discovered that their oven was coming on at 3am every day because they didn’t know the timer was set, and by the time they got up at 7am, the oven was cool. That’s 2 kW for 2 hours every day already being saved for that home!

Don’t Let Your IT Outsource Take You To Hell!

If you’re in the position of choosing an IT services outsource partner, you’re probably looking at the big names in IT and going cross-eyed reading documentation claiming that they are all the foremost authorities in your business, IT and the technology you want. However, there is something more fundamental that you need to consider before you sign on the dotted line in order to avoid disaster …

Years ago, I was looking for a delivery partner for a client who had hired me as a programme manager. The specification was simple enough and I just wanted someone reliable who would work with the business and the rest of the team. Having met with a lot of sales people and seen some great brochures and flashy presentations, we decided as a team to go for the company that wasn’t the cheapest, wasn’t the biggest and also didn’t have the longest track record.


Well, to be honest, it was because they listened, thought and responded appropriately.

This is something we have made the core of our sales approach – you can’t fake it, you have to actually care about the customer’s delivery, and this outfit clearly did. Because people will want to know: we ended up hiring a very young Tibco, which has gone on to grow and be very successful in integration (a company I’m always happy to work alongside as part of a transformation).

Here are the most common mistakes I’ve seen directors make when hiring outsource companies:

  • Going for the most well-known company – it’s safe, but will cost a whole lot more and usually will not provide exactly the solution the customer needs.
  • Not thinking outside the box – often the best solution is not the one being chosen by all your competitors, as it may well not suit you!
  • Looking for the company with the best sales team – if you buy a sales effort, you’re not buying the delivery team you’ll really be working with.
  • Buying the cheapest solution – a lot of sales teams will give you a price as a loss leader to open the door, then overcharge on the smallest of additional requirements or on operational costs and support.

Be bold enough to demand to have contact with delivery people and ask all the difficult questions that you need an outsource partner to know. Such as “How will you guarantee knowledge of our business?” or even “What are the biggest challenges you expect to face during the delivery?”.

It tends to be the simple and direct questions that separate a financially driven company from a results-driven one. Push them to take the time to understand your pressures and issues, and above all, make sure you get on with the people you will need to drag into the office when everything isn’t going as well as originally planned, because they are the guys who will sort it out!

In my experience, choosing the wrong partners leads to late deliveries, inflexibility beyond the contract (regardless of how easy it would be) and lower quality.

I have seen poor choices – driven by assuming a big name delivers – lead to out-of-control projects and programmes, so make sure you get regular updates, see results and have the right contracts in place when you do decide to sign!

If all else fails, hire us for project recovery! Humour aside, good luck and happy research when you look for your ideal technologists.

Don’t Watch Your Integration Programme Fail!

“How much data do you have? And oh my goodness, it’s stored everywhere!!”

I really have heard this before, and it may even have contained more expletives! It came from a salesperson trying to convince a client of mine to adopt their data management solution. It was a scaremongering tactic to try and shock the client into thinking that only this solution would work, but fortunately it didn’t succeed.

With any integration programme, there are a few factors that need to be considered:

  • Where is the customer’s data?
  • How does the client wish to interact with the data?
  • Are there inter-dependencies between the data sources?
  • How does the client want their user interfaces to work?

Obviously, these are complex considerations and need some real understanding in terms of business processes, user journeys/stories and the existing legacy solutions before we can even consider designing a solution, but at a high level, all integration programmes can follow the same basic process if we look at them from a Service Oriented Architecture (SOA) perspective.

At the top of the diagram opposite, we have the user interface that connects directly into the integration layer.

The integration layer worries about interfaces and linking application servers as well as data sources together.

Then we enter a standard cycle in which pre-processing augments the data for the application server to work with.

Next comes the internal function of the application server, before post-processing the data to reply to the integration layer.

This may seem a little daunting, but simply speaking, you can have a number of front end systems connected to the integration layer, then a series of services that connect to the integration layer to process requests from the users.

Separating solutions out in this way makes any integration programme as complex or as simple as you need it to be. For example, consider an intranet front end and a customer front end being serviced by two application servers. It means you do not need to fully replicate code on each of the application servers for each purpose, you can respond only with appropriate and authorised data, and you can have entirely independent user front ends. In addition, if extra data were needed anywhere in the cycle, you can simply gather it directly on the integration layer.
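As a toy illustration of that hub-and-spoke shape (the service name, fields and authorisation table here are all invented for the sketch):

```python
# A toy integration layer: front ends send named requests, the layer routes
# them to registered services, then filters the reply so each front end
# only ever sees the fields it is authorised for.
services = {}

def register(name):
    def wrap(fn):
        services[name] = fn
        return fn
    return wrap

@register("customer.lookup")
def customer_lookup(params):
    # Stand-in for a call into an application server.
    return {"name": "Ada", "balance": 42, "internal_notes": "VIP"}

AUTHORISED = {
    "intranet": {"name", "balance", "internal_notes"},
    "customer": {"name", "balance"},          # no internal fields
}

def integrate(front_end, request, params):
    reply = services[request](params)
    return {k: v for k, v in reply.items() if k in AUTHORISED[front_end]}

print(integrate("customer", "customer.lookup", {"id": 1}))
# {'name': 'Ada', 'balance': 42} -- internal notes stripped for customers
```

The same service backs both front ends; only the filtering in the integration layer differs, which is why no code needs replicating per front end.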

Another example is where you are migrating from an old database to a new one. Your application cycle can pick up data from the old source, check it against the new source and keep the new source in the best condition possible. This can then avoid a big-bang changeover, which is desirable if you prefer not to disrupt your customers or business!
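That read-through migration cycle can be sketched with two in-memory stores standing in for the databases (the record shapes and keys are invented):

```python
# Reads try the new store first, fall back to the legacy one, and copy
# every fallback hit forward -- so the new store converges over time
# without a big-bang changeover.
legacy_db = {"cust-1": {"name": "Ada"}, "cust-2": {"name": "Grace"}}
new_db = {}

def read_customer(cust_id):
    record = new_db.get(cust_id)
    if record is None:
        record = legacy_db.get(cust_id)   # check the old source
        if record is not None:
            new_db[cust_id] = record      # migrate on first touch
    return record

read_customer("cust-1")
print(sorted(new_db))   # ['cust-1'] -- migrated on demand
```

The legacy store is never taken offline; it simply becomes read-only and fades out as the new store fills up.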

In terms of a programme, you can introduce an integration product first, without replacing any of your existing services, so your first stage gate immediately returns potential business benefit.

If you are embarking on a change programme, get in touch and see how we can help you!

Are Operating Systems Out Of Control?

It seems that technology has leaped forward and is accelerating all the time. Yet even the design of the modern CPU is simply an augmentation of the original 1940s design. Also the way we use media and want to be able to use devices has evolved and yet the underlying security considerations are very much behind the times.

In fact, with the widespread usage of cloud storage and computing, generally we are simply creating daemons that inter-connect, rather than operating systems that can intuitively interact with each other. Even when an OS can integrate with storage, there is no clean way to manage updates to and from devices for shared resources, making everything dependent on your network connection. Plus, you may need to update a given file from the cloud to each of your connected devices, rather than intelligently sharing across the shortest connection.

With the introduction of DLNA, there is the potential to throw screens from a device onto a television or other device, yet the adoption and the ability to move from one protocol or media stream to another are still very limited without an advanced understanding of the underlying technology.

Frankly, without fully revising current platforms, we are forever putting a bandage over the wounds caused by outdated concepts, programming languages and protocols that don’t adequately leverage the advances we have made in technology.

One of the languages closest to assembly (that is, the code spoken by the CPU) is C. But we are still in a world where C code needs to be compiled for each different hardware platform, which is why you need to download a binary file that has been pre-compiled for your device, rather than just having a file that can run on any piece of kit. Plus, writing code in C is complex and excludes people who do not want to get to know the inner workings of a PC. With a language like Java, you have the advantage of the language being more adaptive to intermittent connectivity, with Java beans (I’m not joking, they are called that!), but it needs an additional interpreter on the OS, which will always be slower.

In short, if the base system were to be totally re-thought – intuitively object-oriented, based on an encrypted architecture, capable of syncing shared items between user-owned objects, and with a compatibility layer for the hardware – we could revolutionise computing around the world.

For the non-techies: you would be able to have a totally user-secured system that will let you stream a microphone to your TV regardless of which device, as well as view the file you worked on in the office on your phone, laptop or anywhere, because your access is controlled by your identity. Integrating this idea with biometric identification would allow you to roam without caring where you are, ensuring your single password/identification is secured to you.

I have kept this exceedingly simplistic to be accessible to the non-techie community, but obviously there are some complex technical considerations that everyone will be relieved I will not go into. Suffice it to say, a revolution is needed and I think inevitable.

Are You Losing Control Of Your Data?

As we move further into the 21st Century, we enter into a world that is more interconnected than ever. With the Internet of Things (IoT) growing all the time and customer data being sprayed into various clouds at a pulse racing rate, do we even exist as people any more? Are we all simply the sum of our data parts, our electronic profile, or is there something being missed by big data?

Consider it this way: when we search on our favourite search engine, try to find an insurance quote with our fluffy-toy-endorsed aggregation site of choice, or even sell the tasteful socks grandma was kind enough to send last Christmas that just don’t seem to fit with our wardrobe, we are generating massive amounts of data. This not only goes to where we intend, but is then passed between companies, accumulating collateral information as it goes. This means that next time we log in to browse the web, we will see enhanced targeted advertising that is conscious of our preferences and driven by the context of what we are likely to want.

With all this in mind, can any company in the modern market afford to neglect the importance of their data?

Of course, we can’t all be fortunate enough to be as well funded as search engines that rhyme with “oogle” or sound like the microwave is finished! We each have to find out what our customer data means to us, which can be a very tricky process.

Firstly, you need to understand from your business perspective exactly what you need to know about your customers. In short, what insight are you trying to gain? Also, how will you then want to use this insight and what is it you are trying to achieve?

For most companies, it will be sales, image or customer retention. For example, if you have a legacy CRM which does not connect with your SMS or email marketing platforms and is also separate from your product database; you will find that there is a significant lead time in creating a new marketing campaign, updating customer details and understanding the latest consumer needs.

It’s just embarrassing to send out a mail shot to a customer who has just complained in an email about receiving mail shots, but has phoned up about another product that they may be interested in, isn’t it?


If your company doesn’t know what data it is looking at, or how to organise it, then it can seem like this poor chap, standing, staring at a mass of information with no idea what to do!

So, data analysis to the rescue! Or is it really that simple?

Well, no, of course it’s not. For a start, you will be trying to decide which analytics package to go with, if you should in-house or outsource, if you need to integrate your solutions and how the heck to keep juggling everything while you’re still trying to run a business!

The bad news is that it can be very expensive to implement a transformation programme, but the good news is that you do not need to do everything all at once and also, if you go to a company that really understands data analytics and will take the time to talk to your business, it will save an awful lot of messing about as well as cost!

This is not a simple one-answer-fits-all, so make sure that the level of analysis is appropriate for your company.

I have recently advised a London-based restaurant chain that they did not need to launch a multi-million pound integration programme with full predictive analytics! For their business needs, it would have been like using a nuclear-powered hammer to crack a pickled walnut! Instead, I advised them to consider their current solution and what they actually wanted to see. As it transpired, a new CRM and marketing platform with a one-off data cleanse from their multiple sources was enough to fit their needs, as well as to future-proof the solution for at least a decade! (This removed at least a zero from their original estimates and provided a lot more business value, as well as a smiling customer.)

So, although data analytics is essential for any size of business, there are different ways to analyse data and a whole variety of different tools.

My advice to SMEs out there is to ensure you at least have a plan around what you will be using your data for; otherwise it will, sooner or later, feel like you need to vanquish a giant data beast that has chewed through its leash and gone on the rampage! This consideration will mean that you can start off with a spreadsheet, then migrate to a simple database, then move to a dedicated solution and so forth, until you find your dream integrated platform (filled with obligatory rainbows and unicorns) without ever being developed into a corner or ending up with BIG data breathing down your neck and stealing your lunch money.

Plan, take appropriate action and make sure you have the support of a knowledgeable IT partner to guide you through the labyrinth of data analytics and your company will be able to get on with what you do best!

How To Stop IT Change Stalling Your Business!

A few years ago, I was asked to lead an IT change programme with the expectation that the business change programme would follow a few months later. Although it seemed sensible not to bite off more than could be chewed and to separate out these activities, it did incur a lot of project debt and delays in the process.

The idea that the business can advise on requirements without understanding how it should be functioning led to a wild goose chase for authoritative answers to business questions, as well as a solution that no one could agree on.

The alternative would be to put the business change first, essentially embarking on a business change programme without the appropriate IT infrastructure to support the new business processes, creating a lot of technical debt and unworkable teams, governance and approaches.

Personally, I believe that the only real way to initiate change is to take thin and compatible slices of business and IT change and unify them to form more of an agile process. For example, assign business owners to be authoritative over different areas of the business as well as over systems and solutions that they will need.

This forms a first stage: assessing where the business and IT are. From there, the owners can start to decide where their areas need change, which allows a picture to be built of what the transformation road-map should look like. This level of clarity allows the whole business to understand what their role will be during the transformation and to contribute productively towards realising the vision of what will be needed.

Of course, this will require consultation across the company with staff at all levels to ensure that consumers of the new processes, systems and technologies have their say and can receive appropriate training, as well as understand the new governance, processes and interfaces. This makes any transformation programme an evolution that should change the goal slightly as more understanding comes from each iteration of change. Like referring to a map as you take each step and modifying your understanding of exactly where you are standing and where your next step should be.

In conclusion, even if you do want to separate business and IT change, the two programmes should at the very least communicate to align changes and ensure each is prepared to resolve issues of the other. This will allow IT issues to be aired to the business and vice versa, ensuring full support is provided as needed for the best outcome – even if the other side limits itself to minimal critical changes until a more appropriate time.

Why Gamers Make The Best Project Managers!

As the world of technology marches towards virtualisation and immersion, the ability to blend between the worlds of reality and virtual reality is more important than ever. Now, I'm not suggesting that going on a killing spree in Doom is a particularly transferable office skill; I'm talking about games that show progress and resources and, most importantly, a clear summary of that progress.

At a high level, a PM is controlling:

  • Quality
  • Timescale
  • Budget

The holy trinity of project management is almost always captured in some way on the main display of most games, with a slick way to drill down to the details as needed. All too often, a project can be sideswiped by background noise covering up issues or even misrepresenting the importance of particular risks.

However, by being prescriptive about the structure and tracking of information, heads up displays effectively control every layer of data in a way that a lot of projects fail to.

Let’s face it, if games did not represent the high level information effectively, then people would not play them. This is something that the mobile gaming industry understands all too well!

This even extends to modern operating systems. Think about swiping down on an Android phone to see the latest alerts: the OS has adopted the idea of flagging risks and issues in an easily consumable list that users can take on board. Further still, the screens use clear icons to hold information and applications for users to consume, with the ability to group apps into a hierarchy.

Simply put, someone who frequently uses games, apps and technology comes to expect a few screens that provide every piece of information that is needed. This produces PMs who expect clarity and uniformity, and who can present information to their seniors in the same way.

After all, who wants a 40 page report that may be thorough, but takes hours to consume all the information?

“At a glance” documentation conveys every piece of information quickly with the ability to drill down through the hierarchies and quickly reach critical data without having to read through EVERYTHING!
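The drill-down idea above can be sketched in code. The following is a minimal, hypothetical example (the RAG colour scheme and the `roll_up` function are my own illustration, not anything from a real PM tool): each level of a project hierarchy rolls its children's statuses upward, worst status wins, so the top level is the "at a glance" view.

```python
# Hypothetical sketch: roll up RAG (red/amber/green) statuses through a
# project hierarchy so the top level gives the "at a glance" summary.
SEVERITY = {"green": 0, "amber": 1, "red": 2}

def roll_up(node):
    """Return the worst status found in this node or any of its children."""
    status = node.get("status", "green")
    for child in node.get("children", []):
        child_status = roll_up(child)
        if SEVERITY[child_status] > SEVERITY[status]:
            status = child_status
    return status

# A project with one healthy workstream and one with an amber task.
project = {
    "children": [
        {"status": "green"},
        {"children": [{"status": "amber"}, {"status": "green"}]},
    ]
}
print(roll_up(project))  # the single status a PM sees at the top level
```

The point of the sketch is that a reader never has to read EVERYTHING: the summary is computed from the detail, and the detail is still there to drill into when the summary turns amber.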

As well as simplicity and the ability to quickly communicate information, gaming trains people to keep trying and to believe that there is always a solution. This is something at the core of the entrepreneurial spirit! The “can do” attitude can be contagious and turns a losing team into one that can win time and time again.

So, next time you interview a PM, think about the hours of training they may have put in during their private time to enhance their skills: controlling and allocating work, managing resources and keeping track of budgets, timescales and quality, all while promoting success!

How To Solve Problems In Your Software Deliveries

Having worked with some huge companies and some not so big, both as clients and as vendors, I have found that there are common patterns of behaviour that result in success.

This includes:

  • Teams having a clear view of what the finished project will be
  • Good communication between team members and simplistic documentation
  • Regular meetings that are aimed at success, not at blame
  • A good understanding of the requirements
  • Frequent demonstrations of functionality (even with a temporary user interface)

Generally, the mistakes that prevent delivery arise when people are trying to justify their role, rather than the team doing what it does best: acknowledging that not everyone knows everything all the time, but that the whole team is needed to be the moving parts of the delivery machine.

Typically, not knowing how we get from a requirement to a deliverable, or what the minimum viable product looks like, causes frustration, delays and general misery.

In my experience, most people really want to be productive and it’s up to senior staff to provide the clarity and confidence for their team to do what they can. So celebrate the successes you have and make sure everyone on a team gets the recognition they deserve. Encourage people to express something they have done that is just plain awesome and ensure people can speak up when they need to.

Egos are the enemy of group productivity, and it is human nature to fight fire with fire, so be sure to be strong as a leader against those who seek to throw blame, even if it is not at you. Those quick to blame are slower to find solutions, and those worried about taking blame will not step forward to find a resolution but will try to hide the mistake, causing it to grow in the code base.

Infrastructure is another problematic area. Any company still using all-physical infrastructure will get what it deserves! Have a way to spin up additional environments for your development and test teams; then you can spend more time getting the job done instead of waiting for hardware to be installed and networked, OSs to be set up and software to be configured. Use a default build that services most requirements, so that at a pinch you can just grab “one more of those”!
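The "default build" idea can be made concrete with a small sketch. Everything here is hypothetical illustration (the `DEFAULT_BUILDS` catalogue, the `provision` function and the spec fields are invented for this example, not any real provisioning API): the team keeps a small catalogue of standard environment specs and only overrides the few settings that differ.

```python
# Hypothetical sketch: a catalogue of default environment builds, so a team
# can "grab one more of those" rather than wait for a bespoke setup.
DEFAULT_BUILDS = {
    "dev":  {"cpus": 2, "ram_gb": 8,  "os": "ubuntu-22.04", "extras": ["build-tools"]},
    "test": {"cpus": 4, "ram_gb": 16, "os": "ubuntu-22.04", "extras": ["test-harness"]},
}

def provision(kind, overrides=None):
    """Return the spec for a new environment based on a default build,
    with only the requested settings changed."""
    if kind not in DEFAULT_BUILDS:
        raise ValueError("No default build for %r" % kind)
    spec = dict(DEFAULT_BUILDS[kind])   # copy, so the catalogue stays intact
    spec.update(overrides or {})        # tweak only what differs
    return spec

# "One more of those", plus a one-off with extra memory.
standard = provision("test")
beefy = provision("test", {"ram_gb": 32})
```

In practice the spec would feed whatever provisioning tooling you actually use; the design point is that requests start from a known-good default rather than from a blank sheet.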

Getting bogged down in the details can harm delivery. It is important to have processes and to make sure that low level issues are designed and addressed, but remember to get the whole team to step back from time to time and look at what the headlines are. You never know, team members with a fresh perspective could change the destiny of the project.

Supporting tools need to be developed. Although they are not part of the end game, most project directors overlook the need to have tools that, for example, check data across sources, or aid with testing.
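A cross-source data check, the kind of supporting tool mentioned above, need not be elaborate. Here is a minimal sketch (the function name and the sample invoice records are invented for illustration): given two keyed datasets, it reports records missing from either side and records whose values disagree.

```python
# Hypothetical sketch of a supporting tool: reconcile two keyed data sources
# and report what is missing from each side and what disagrees.
def diff_sources(source_a, source_b):
    """Compare two dicts of {record_id: value} and summarise the differences."""
    keys_a, keys_b = set(source_a), set(source_b)
    return {
        "only_in_a": sorted(keys_a - keys_b),
        "only_in_b": sorted(keys_b - keys_a),
        # Present in both, but with different values.
        "mismatched": sorted(k for k in keys_a & keys_b
                             if source_a[k] != source_b[k]),
    }

# Example: invoice totals held in two systems that should agree.
billing = {"inv-1": 100, "inv-2": 250, "inv-3": 80}
ledger  = {"inv-1": 100, "inv-2": 275, "inv-4": 60}
report = diff_sources(billing, ledger)
```

Even a tool this small, run regularly, surfaces data drift long before it becomes a delivery problem.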

Gaps in legal agreements between vendors or between delivery silos can make life difficult. Typically, areas of ownership are clear at the start, then become blurred as time goes on. Firstly, make sure clear lines of ownership are understood and regularly revised. Secondly, ensure there is scope to revise areas of ownership without incurring significant cost. This can take negotiation, but you are better off setting some overlap from the outset and then allowing people to do a little less. This will raise costs at the start, but at least you will not go over budget.

Do not pick the cheapest tenders from your vendors. Generally you do get what you pay for, and a cheap tender means a cheap product. Telling vendors that you pick the average-cost tender makes them really think about what they're doing and helps ensure you get what you need. Always add specifications that guarantee resources are dedicated as needed, not shared with other projects, and that consideration has been given to unknowns and delivery support, as well as the vendor's ability to adapt and change as needed.

Remember that anything that is developed needs to be supported. Don’t overlook the handover to operations and make sure everyone is ready and has signed off their ability to catch!

Apart from that, just keep smiling. Even when things are not going as planned, being a misery brings everyone else down. As long as you are working hard, thinking things through and trying for the best outcome, you have no need to frown. It’s up to each member of a project team to make work somewhere people want to be in order to hit goals and make the client happy!