Vinoo Prakash is responsible for formulating the product and platform development strategies at SunTec Business Solutions. He plays a pivotal role in defining platform and product strategy and coordinating product life cycles. His responsibilities include evaluating the platform architecture and technology used for SunTec’s products, developing road-maps and defining feature requirements. A seasoned professional with more than 15 years of experience at SunTec, Vinoo previously held the role of Head – Pre-Sales & Consulting.
The urban legend of the library that sank is a good analogy for why any business needs a robust foundation. In the story, the architect did not account for the weight of the books, and the library began to sink. A consumer-facing business that does not account for the weight and diversity of its data is likewise bound to sink without the proper infrastructure in place.
New channels, be it social media or smartphones, allow consumers to voice displeasure or satisfaction and make it immediately visible to their service provider. Feedback, good or bad, creates a lot of data that can be useful to an organisation’s data analysts. Transactional data, created for example when a consumer interacts with a service provider online, is vital for a business to harvest and store safely. Emphasis is also shifting to the way data is harvested, stored and analysed, creating a new framework for data analytics.
Databases are the foundation on which companies can build a strong analytical and architectural framework. This is becoming even more important with the constant roll-out of Big Data and Internet of Things technologies to improve customer experience and operational efficiency.
Businesses that stick to storing all their data in one place will be at a loss. Relying on a single instance for everything, their lack of agility quickly becomes apparent: it can lead to application downtime and prevents them from reacting quickly during business events that demand fast response times. This lack of agility spills over into ineffective analytics, which could delay the launch of a new offer and, in turn, harm customer experience and retention.
The era of using one database for all business needs is over. The ubiquity and sheer volume of data are forcing businesses to take a new approach to handling and storing different types of data. From NoSQL stores and Relational Database Management Systems (RDBMS) to in-memory databases, there is always a particular database that solves a particular data problem better than the others.
With this revelation comes the Polyglot Data Management strategy. Like the dictionary definition of polyglot – someone who speaks several languages fluently – this strategy suggests that a data management framework is better served by using several different database “languages”.
The term was originally used in technology to describe building applications in several different languages in order to exploit the problem-solving strengths of each. From Java to HTML5, the number of programming languages has skyrocketed thanks to web development and application-delivery practices such as DevOps, with each language resolving a different class of problems.
Polyglot Data Management calls on businesses to use different databases for specific tasks, be it running batch analytics in Hadoop or storing graph data from a social networking application.
As a business grows and introduces new digital channels of interaction, data transactions and operations take place using different applications. These applications will sit on top of several different databases, using each for a specific purpose. This means each business will have multiple environments and instances on which applications run, cross-pollinating databases with data from several different applications.
This new method of data management helps businesses avoid overloading one particular system with a deluge of data. That is critical where availability is paramount, especially for an organisation that is online and dependent on a database to handle millions of transactions every second of every day. The framework also allows for streamlined analytics: each database runs its own natively built analytics, and the results are then consolidated in one place.
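The consolidation step can be sketched very simply. In this illustrative Python example, each source function stands in for analytics run natively inside a particular store (a SQL aggregate in the relational database, a map-reduce job in the document store); the figures and function names are invented purely for demonstration.

```python
# Each database runs its own natively built analytics; a thin
# consolidation layer merges the per-store results into one report.
# Both source functions are hypothetical stand-ins with sample figures.

def rdbms_analytics():
    # e.g. the result of a SQL aggregate run inside the relational store
    return {"total_revenue": 1250.0, "transactions": 48}

def nosql_analytics():
    # e.g. the result of a map-reduce job run inside the document store
    return {"active_sessions": 310, "abandoned_carts": 12}

def consolidate(*reports):
    """Merge per-database analytics results into a single dashboard view."""
    merged = {}
    for report in reports:
        merged.update(report)
    return merged

dashboard = consolidate(rdbms_analytics(), nosql_analytics())
print(dashboard["total_revenue"], dashboard["active_sessions"])  # 1250.0 310
```

In a real deployment the consolidation layer would of course query live systems and reconcile overlapping metrics, but the principle is the same: analytics run where the data lives, and only the results travel.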
It is imperative to determine the types of data the business relies on – not in a technical sense but in a business sense: financial data, user activity logs, shopping carts or stockroom data, for example. Knowing these intricacies then reveals which types of data each part of the application creates, such as time-series queries or graph data from a social networking platform embedded in the application layer.
Once it is understood what data is created, it becomes possible to determine the database best suited to each task. If a business needs to keep customer records without losing any transactional detail, an RDBMS is ideal as a permanent store of data; if it needs fast response times and scaling capabilities for web applications, a NoSQL database is the better fit, as it provides faster access and easier scaling.
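This routing decision can be made concrete with a minimal sketch. In the Python example below, sqlite3 stands in for a full RDBMS and a small dictionary-backed class stands in for a NoSQL key-value store; all names and values are illustrative, not a prescription for any particular product.

```python
import sqlite3

# RDBMS side: durable, transactional store for customer records.
# sqlite3 is used here as a stand-in for a full relational database.
rdbms = sqlite3.connect(":memory:")
rdbms.execute(
    "CREATE TABLE transactions (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)"
)

class KeyValueStore:
    """Illustrative stand-in for a NoSQL key-value store:
    fast reads and writes, no joins, no durability guarantees."""
    def __init__(self):
        self._data = {}
    def put(self, key, value):
        self._data[key] = value
    def get(self, key):
        return self._data.get(key)

nosql = KeyValueStore()

def record_purchase(customer, amount, cart):
    # Transactional detail must never be lost -> RDBMS, committed.
    with rdbms:
        rdbms.execute(
            "INSERT INTO transactions (customer, amount) VALUES (?, ?)",
            (customer, amount),
        )
    # Session/cart state needs speed, not durability -> key-value store.
    nosql.put(f"cart:{customer}", cart)

record_purchase("alice", 42.50, {"items": ["book"], "status": "paid"})
total = rdbms.execute(
    "SELECT SUM(amount) FROM transactions WHERE customer = ?", ("alice",)
).fetchone()[0]
print(total)                    # 42.5
print(nosql.get("cart:alice"))  # {'items': ['book'], 'status': 'paid'}
```

The point of the sketch is the routing in `record_purchase`: the same business event writes its permanent record to the relational store and its fast-changing session state to the key-value store, each database doing what it does best.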
One of the main hurdles in a Polyglot Data Management strategy is that businesses need a database administrator – or a whole team – capable of managing all the databases, which is costly.
With a different database in place for each purpose, businesses can take advantage of each one’s innate problem-solving capabilities, harvesting and analysing data efficiently and effectively.
Businesses can use a technology architecture based on industry-agnostic domains, designed to keep systems highly available from both a real-time and a batch perspective. This ensures the different stages of data manipulation and control can take place seamlessly at the application layer.
By following this rule of thumb, analytics and business intelligence can be improved, queries can be launched sooner and in real time, and bespoke offerings can be created and delivered to the customers they apply to.
Having a heterogeneous database architecture in place can also improve agility in a particular vertical market such as banking. Using traditional and new databases in tandem, a bank can execute customer orders faster and approve a cross-border transaction request immediately.
Libraries need to be built on a strong foundation, otherwise the building will sink under the weight of all the information locked away in those books. Similarly, businesses need to be built on a solid IT infrastructure which caters for a variety of data created by customers and partners. One way to approach that is to use all the database platforms available to them, which will lead to a Polyglot Persistent future for any company.