How to Use Open-source Platforms for Master Data Management
Managing tremendous quantities of data requires a platform fit for the job.
Data is often referred to as the new oil of the digital economy. The comparison may not hold in all respects, but it usefully illustrates two essential points: data is valuable, and it must first be collected and processed before it can be used.
Many companies are currently confronted with these massive tasks on their way to digitization. And it's not just about big data; it's about how quickly, and at what quality, basic information such as customer data, material data, and article data is available. This is the domain of master data maintenance, or master data management (MDM).
Master data is characterized by the fact that it remains unchanged over a long period of time and serves as the single valid version, or "golden record," in transactions made from multiple locations. In times of digitalization, master data in companies should meet two requirements: it should be complete, accurate, and up-to-date (data quality), and it should be quickly available throughout the company.
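To make the "golden record" idea concrete, here is a minimal sketch in Python: two partially overlapping customer records from different systems are consolidated into one valid version. The field names and the simple "most recently updated non-empty value wins" merge rule are illustrative assumptions, not a prescribed MDM strategy.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class CustomerRecord:
    """One system's view of a customer; field names are illustrative."""
    source: str
    updated: date
    name: Optional[str] = None
    email: Optional[str] = None
    phone: Optional[str] = None

def golden_record(records: list[CustomerRecord]) -> CustomerRecord:
    """Merge duplicates: for each field, keep the most recently
    updated non-empty value (a deliberately simple survivorship rule)."""
    ordered = sorted(records, key=lambda r: r.updated)  # oldest first
    merged = CustomerRecord(source="mdm", updated=ordered[-1].updated)
    for rec in ordered:  # later records overwrite earlier ones
        for field_name in ("name", "email", "phone"):
            value = getattr(rec, field_name)
            if value:
                setattr(merged, field_name, value)
    return merged

crm = CustomerRecord("crm", date(2019, 3, 1), name="ACME GmbH", phone="+49 30 1234")
erp = CustomerRecord("erp", date(2019, 5, 1), name="ACME GmbH", email="buy@acme.example")
print(golden_record([crm, erp]))
# -> one record combining the ERP e-mail with the CRM phone number
```

Real MDM platforms apply far more sophisticated matching and survivorship rules, but the principle is the same: many partial views, one authoritative record.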
This requires a lot of work. The effort, however, is relative: using data takes working time in any case, and that time is better invested in ordering data than in searching for it. An example: a photo of a particular product is needed for a new product flyer. After some searching and several phone calls, the responsible marketing employee learns that suitable shots exist in business unit X. On the drive there is a series of photos (say, more than 30 shots), some of them very similar in ways the image preview doesn't reveal. Clicking through everything takes time, and the colleagues in the business unit are busy with their daily activities. Eventually the marketing manager receives a few pictures by e-mail, followed by a discussion about selection, release for publication, and matching captions. In short, this process consumes a lot of time – every time a photo is needed.
Invest time in ordering, not searching
If, on the other hand, one invests only part of this working time in selecting the two or three best images, writing a precise image description and caption, and documenting the image release, a confusing series of photos becomes a valuable data record – a digital asset.
This fulfills the first requirement – complete and correct data – and another area of master data is in order.
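What such a curated asset record might look like, as a hypothetical sketch in Python (the schema below is invented for illustration and is not any specific product's data model):

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ImageAsset:
    """A curated image plus the metadata that makes it findable and
    immediately usable; all field names are illustrative."""
    file_path: str
    description: str          # precise image description for search
    caption: str              # ready-to-publish caption
    release_approved_by: str  # documented release for publication
    release_date: date
    keywords: list[str] = field(default_factory=list)

asset = ImageAsset(
    file_path="/assets/products/widget-x_front.jpg",
    description="Widget X, front view, white background, studio lighting",
    caption="The new Widget X in its 2019 design",
    release_approved_by="jane.doe (marketing)",
    release_date=date(2019, 6, 12),
    keywords=["widget-x", "product-photo", "flyer"],
)
```

With description, caption, and documented release attached, the next flyer needs a keyword search rather than a round of phone calls.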
Many companies are currently at this turning point: systems for master data management are already in use for some areas of application – mostly production-related areas, procurement, and purchasing, in the form of robust ERP systems – while other areas, often marketing or product development, are still searching for suitable software solutions. There are also specialized programs on the market, such as CRM, PIM, or DAM software.
Does this mean a separate database system for each department or application? This idea is likely to cause IT managers a lot of discomfort, and it's good if the CTO or CIO vetoes it and pushes for a more comprehensive solution – not only because the IT department would otherwise have to operate a heterogeneous landscape that drives it to the breaking point, but because this tidying up of storage locations simply doesn't solve the underlying problem: master data is still not readily available throughout the company.
Why is purely application-related master data maintenance not a solution?
For the product flyer example above alone, much more data would be required: product data and key figures, logos and address data of the respective distribution partners, and comparative figures. And that is just one example. Whether it's exploiting cross-selling potential, competently running social media channels, answering customer inquiries, reacting to a crisis or seizing an opportunity, or developing new product ideas and pilot projects – it always comes down to time-to-market or time-to-launch. The quick availability of correct data thus becomes an ingredient of success.
However, if users have to log in to several databases to gather the information they need, this simply takes too long. And because employees are also interested in working efficiently, assets or search results that are likely to be needed again find their way into private local repositories, disconnected from the source and from subsequent updates – to save time the next time around. The accuracy, completeness, and consistency of the master data are undermined once more.
The solution can only be a company-wide master data system in which all the required data – along with the relationships between the data – is mapped and can be retrieved in high quality.
Enterprise-wide MDM requires a precise design
Such a project is, admittedly, a different story than an image database. It needs the commitment of management and of all participants, and several months' time. Experienced external partners can nevertheless support and guide the process safely, taking over part of the work.
Linking the individual systems is important, but not at the beginning. In practice, we start with a blank sheet and ask the following questions: Which challenges in our company can we solve through more efficient data use? Which entities are therefore important? Where do we find this data, and how do the different pieces relate to each other? For a better overview, we recommend cataloging metadata and sources, as sketched below. The quality and usefulness of the individual data collections must also be assessed: some data is sorted out as unusable, other data is enriched.
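A simple way to start this catalog is a small, machine-readable inventory of entities, sources, and quality verdicts. The structure below is a hypothetical sketch, not a prescribed format:

```python
# A minimal, hypothetical catalog of master data entities and their sources.
catalog = [
    {
        "entity": "customer",
        "source_system": "CRM",
        "owner": "sales",
        "fields": ["name", "email", "address"],
        "quality": "enrich",   # usable, but needs enrichment
    },
    {
        "entity": "product",
        "source_system": "ERP",
        "owner": "procurement",
        "fields": ["sku", "description", "unit_price"],
        "quality": "ok",
    },
    {
        "entity": "legacy_articles",
        "source_system": "shared drive (Excel)",
        "owner": "unknown",
        "fields": ["article_no", "name"],
        "quality": "discard",  # sorted out as unusable
    },
]

# Sorting out: keep only the collections worth migrating or enriching.
usable = [entry for entry in catalog if entry["quality"] != "discard"]
```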
The analysis should not be limited to internal or self-generated master data; this is a common mistake. Rather, we should ask ourselves: What external data is relevant in our industry, which sources are recognized, and how can we include this data? This matters for several reasons: for internal evaluations and comparisons, for business intelligence and trend analysis, and also for external presentation. Companies that rely on thought leadership and on positioning themselves in industry associations and communities can effectively support experts from their own ranks with quick access to up-to-date key figures.
The modeling of the data should only take place at a later stage, when all components and correlations have been recorded. Finally, the individual systems must be connected via interfaces so that automatic data matching is possible.
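As a rough illustration of such an interface, the following Python sketch pulls product records from a source system's REST API and upserts them into the central MDM platform. Both endpoints, the payload fields, and SKU as the matching key are assumptions for illustration, not a specific product's API:

```python
import requests  # third-party HTTP client (pip install requests)

SOURCE_API = "https://erp.example.com/api/products"  # hypothetical endpoint
MDM_API = "https://mdm.example.com/api/products"     # hypothetical endpoint

def sync_products() -> None:
    """Fetch products from the source system and upsert them into the
    MDM platform, matching on the SKU as the shared key (an assumption)."""
    products = requests.get(SOURCE_API, timeout=10).json()
    for product in products:
        sku = product["sku"]
        # PUT is idempotent: it creates the record or overwrites a stale copy.
        response = requests.put(f"{MDM_API}/{sku}", json=product, timeout=10)
        response.raise_for_status()

if __name__ == "__main__":
    sync_products()
```

In practice, such a synchronization would run on a schedule or be event-driven, so that changes in the source systems reach the master data automatically rather than by hand.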
Open-source platforms combine the latest technology with independence and flexibility
However challenging the conceptual preparation may be, the technical implementation is easier. A practical and sustainable choice is an open-source platform, which offers advantages over proprietary systems, such as:
The enterprise is not bound for years to a specific manufacturer and its pricing and product policy; it acquires full ownership of the software. Product updates from the supplier are recommended but don't necessarily have to be installed. In any case, the software is continuously developed by numerous contributors, faster than any single supplier could manage. Users can therefore always stay up-to-date with technology and innovation.
The platform can be flexibly adapted to changing conditions. In larger companies, this can be done by their own IT departments. Companies with fewer resources of their own can commission freelance developers without being tied to certified partners and their usual high hourly rates. Because the platform is based on common frameworks, the learning curve is short. In addition, it is precisely younger developers who prefer to work with open source, which makes companies using such platforms more attractive as employers for this generation of developers.
Lower costs and greater security
The software itself is usually free. If the provider is commissioned with the initial adaptation, this incurs the usual costs of implementing a solution, but no recurring license fees – and therefore none of the expense associated with license audits. The total cost of ownership is correspondingly lower.
One of the most common questions concerns security: isn't an open-source system vulnerable to hacker attacks? On the contrary: because many users with an interest in their own security work with the code every day, potential vulnerabilities are discovered and resolved much more quickly. Finally, thanks to open standards, any system can be integrated into the open-source platform, no matter how niche or sector-specific. Interoperability is therefore ensured.
Seeing data as assets
So much for the technology. For sustainable success, it's crucial that the newly introduced system is actually adopted. Employees must get used to handling master data in a structured way and learn how to store and use data. Guidelines for governance, provenance, and data cleanliness are a prerequisite, but should not be perceived as rigid instructions. What matters is a new understanding: exact product data, current and complete customer data, but also images and product videos – all of these are assets. The ordering of data, and the fact that master data is now readily available on demand across the enterprise, is the result of a shared effort. It is a valuable asset: less like the machinery, and closer to the inventions and patents that traditional companies are proud of.
Pimcore Guest Blog published on Business.com