Big data projects, cloud migrations, and digital transformation: why it doesn’t need to be so hard
Fri, 11th Feb 2022

“The pandemic has sped up digital transformation” has been the phrase on everyone's lips since the pandemic began. But the reality is slightly different. Although digital transformation is, in many cases, steaming ahead, some enterprises remain hesitant to embrace the cloud despite its evident benefits, especially for big data projects.

When the Australian Bureau of Statistics released its annual Characteristics of Australian Business Survey in June 2021, the results revealed that barely more than half (55%) of all businesses reported using cloud computing in 2019-20. Similarly, in Europe, the European Union's statistical office found that cloud computing has yet to go mainstream, with only 41% of enterprises using it in 2021.

So while the pandemic appears to have forced many enterprises to take a hard look at their business processes and technology infrastructure, there is still a long way to go before cloud adoption and digital transformation come anywhere close to 100%.

So, why is it really time to move data projects to the cloud, what's holding enterprises back, and how can they make it simple?

Flying to the clouds – or falling to the ground?

Put simply, by migrating their big data architecture to the cloud, enterprises can reap a number of benefits: driving business growth whilst lowering the overall cost of operations, strengthening data governance, and gaining a fast, scalable platform.

Moving big data to the cloud sounds simple enough. Still, it requires a high level of technical knowledge and often continuous coding resources from data engineers and core IT groups. This is why some enterprises either try and fail, or continually postpone these projects.

For example, many developers write custom code to integrate with each application's programming interface (API) and authentication mechanisms. While this lets data move freely between various applications and a cloud data warehouse or data lake, it is time-consuming and often error-prone. These pitfalls only grow more pronounced during the maintenance stage of cloud-based big data projects.
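To illustrate the kind of hand-written plumbing described above, here is a minimal Python sketch of a bespoke API-to-warehouse loader. The endpoint, token, and field names are hypothetical, not drawn from any real product; the point is how much authentication, pagination, and error handling ends up as custom code that someone must then maintain.

```python
import requests

API_URL = "https://api.example-crm.com/v2/invoices"  # hypothetical endpoint
API_TOKEN = "..."  # hypothetical credential; in practice, from a secrets manager

def fetch_all_invoices():
    """Hand-rolled extract: auth, pagination and error handling are all custom code."""
    headers = {"Authorization": f"Bearer {API_TOKEN}"}
    page, records = 1, []
    while True:
        resp = requests.get(API_URL, headers=headers,
                            params={"page": page, "per_page": 500}, timeout=30)
        resp.raise_for_status()  # any change to the API's auth or limits breaks here
        batch = resp.json().get("data", [])
        if not batch:
            return records
        records.extend(batch)
        page += 1

def load_to_warehouse(records):
    """Hand-rolled load: the schema mapping is hard-coded and decays as the API evolves."""
    for r in records:
        # Stand-in for a real warehouse INSERT; field names are hypothetical.
        print(r["id"], r["amount"], r["issued_at"])

if __name__ == "__main__":
    load_to_warehouse(fetch_all_invoices())
```

Every line of this must be revisited whenever the source API, the credentials scheme, or the destination schema changes, which is exactly the maintenance burden the article goes on to describe.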

As with any other software project, code decays over time and must be updated. Furthermore, if the developer who wrote the code leaves the company, the IT organisation's ability to understand the pipeline at the code level often leaves with them.

One of the biggest challenges enterprises have had to overcome in moving to cloud-based big data projects is this time drain on IT staff. Finding individuals with the necessary skills and experience to build big data and cloud pipelines is a challenge.

Unsurprisingly, this is further impacted by the ever-growing skills gap in the IT landscape, an issue compounded to some degree by the ‘Great Resignation’ spurred on by the pandemic. The Australian Bureau of Statistics estimates that over 600,000 Australians expect to move jobs in 2022. That's roughly 5% of the total workforce.

These talented individuals are in ever shorter supply, and demand is only increasing. If you do manage to get them into your IT team, having them focus solely on managing and maintaining the plumbing that supports the big data environment before, during, and after migration to the cloud is not a smart use of resources. It also feeds into another major issue in moving to the cloud: cost.

With a team full of highly skilled individuals, you want them to have the time to focus on projects that deliver significant and strategic benefits to the business. The cloud provides flexibility and scalability, which can fuel innovation in the enterprise. However, the time-to-innovation promised at the start of a cloud migration will never be achieved if teams are too busy wrestling with infrastructure management to make the big data project work.

Buy vs build 

Finding a solution to this problem might seem tricky, but it is relatively simple, and it comes down to buying vs building. The chances of needing to self-build every aspect of your IT estate are slim and, for most, cost-prohibitive, so why not look to purpose-built, off-the-shelf SaaS platforms?

If enterprises want to see their big data projects flourish in the cloud sooner, they should look towards modernising their data architecture. This means introducing data integration (iPaaS), processing (BDaaS) and storage (SaaS) platforms to the enterprise, enabling organisations to seamlessly deliver large data sets to and from their cloud-based data lakes, regardless of where the data originates.
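As a contrast to the hand-written loader sketched earlier, an integration platform typically reduces the same pipeline to a declarative definition. The sketch below is hypothetical Python, not any specific vendor's SDK; the connector names, secrets reference, and bucket path are illustrative. The idea is that source, destination, and schedule become configuration, while the platform owns authentication, pagination, retries, and schema drift.

```python
# Hypothetical declarative pipeline definition; real iPaaS products expose
# similar concepts through their own consoles, SDKs, or YAML configs.
pipeline = {
    "name": "crm-invoices-to-lake",
    "source": {
        "connector": "example_crm",                   # pre-built connector owns auth + pagination
        "credentials_ref": "vault://crm-api-token",   # hypothetical secrets reference
        "objects": ["invoices"],
    },
    "destination": {
        "connector": "cloud_data_lake",
        "path": "s3://analytics-lake/raw/crm/",       # hypothetical bucket
        "format": "parquet",
    },
    "schedule": "0 * * * *",     # hourly; the platform handles retries and alerting
    "schema_evolution": "auto",  # new source fields land without code changes
}

# In a real platform this definition would be registered via its API or console;
# the point is that nothing above is bespoke integration code to maintain.
print(f"Pipeline '{pipeline['name']}' defined with "
      f"{len(pipeline['source']['objects'])} source object(s).")
```

Under this model, when the source API changes, the connector vendor updates the integration once for every customer, rather than each IT team patching its own scripts.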

An additional benefit of this approach is increased productivity: eliminating the mundane, repetitive manual work involved in ingesting and transforming data frees IT teams to focus on value-adding activities instead.

Say goodbye to complexities 

Running big data projects in the cloud should be simple. All organisations, regardless of size, should be able to realise all the benefits the cloud provides as soon as they get up and running. It's only by taking a step back at the planning stage and removing the complexities surrounding cloud migration and integration that businesses will finally be able to unleash their big data projects to innovate and deliver business value.