In enterprises, applications execute business processes and often create data as a side effect. Traditional business reporting uses this data to provide financial insights and help improve processes and products. But application data is often fragmented and siloed across complex enterprise application estates. In addition, it can be difficult to pull the data out of the application database or the SaaS service it lives in.

Enterprise application data fragmented into disconnected silos.
Individual users and data engineers often build custom, batch-oriented data pipelines for specific use cases. Fragmentation, duplication and inconsistencies across these application data silos and pipelines prevent data sharing and reuse.
Data locked up in a single application that can’t be shared with other applications will be harder to reuse compared to data that can easily be shared across the enterprise. Shared data can become reusable building blocks for new and improved applications, services and products. This drives better decision-making, improves business processes and product design, and reduces risk.
What’s needed is a way to move enterprise application data to a centralized, shared, open platform that can:
- Decouple sources from destinations to reduce complexity.
- Store shared data that can easily be reused by existing and new applications.
- Leverage shared data as events to drive application behavior.
- Integrate shared data with existing application data sources.
- Allow access to shared data in a real-time stream, historical batch or both as necessary.
- Achieve more reuse in a short period of time by exploiting simple, repeatable data-sharing use cases.

Enterprise data integrated with Kafka.
Kafka Is the Centralized, Shared, Open Platform You Need
Data architects are looking for shared, real-time, event-driven and open data platforms that are widely used, proven and open source or based on open source protocols.
Apache Kafka decouples data sources from destinations, eliminating point-to-point complexity. Producers publish events to topics without knowing the consumers, and consumers subscribe independently, removing direct dependencies and the need to modify applications. Topics become building blocks for event-driven applications and analytics.
Kafka’s event-driven, pub-sub model allows teams to add or change data flows without touching source systems or existing pipelines. In contrast, REST APIs are more synchronous and require tighter integration with application software. This can increase development time and risk.

As open source software with a vast ecosystem of connectors and tools, Kafka enables enterprise-wide standardization, allowing any team or application to share and reuse data with fewer custom integrations and less vendor lock-in.
Kafka stores data durably as an immutable log, turning transient events into reusable building blocks accessible to existing and future applications. With configurable retention, topics can preserve full history or recent windows, enabling replay for onboarding new systems or recovering from failures.
Legacy ERP systems can also benefit from Kafka by streaming order events using Kafka Connect for analytics, fraud detection or new AI services without re-extracting data from the source. This shared, persistent data layer breaks silos and turns application data into enterprise capital ready for wide sharing and reuse.
Kafka can deliver both real-time streaming and historical batch access while improving data reusability through simple, repeatable use cases. Real-time consumers power dashboards or alerts. Kafka can move data to open lakehouses like Apache Iceberg for reporting and analytics.
The faster data can be reused, the faster new projects can happen. Quick wins, such as syncing a database to a search index or feeding logs to security tools, can be deployed quickly using prebuilt connectors. These low-risk, high-impact patterns scale predictably, reducing integration costs and accelerating innovation, including AI experimentation and deployment, all without infrastructure overhauls or disruptive application changes.
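To make the log-and-replay idea concrete, here is a minimal in-memory sketch (not a real Kafka client) of a topic as an append-only log where each consumer tracks its own offset, so a consumer added later can still replay the full history:

```python
# In-memory analogy for a Kafka topic: an append-only, ordered log.
# Producers append without knowing who consumes; each consumer keeps
# its own offset, so new consumers can replay history independently.

class Topic:
    def __init__(self):
        self._log = []  # ordered, never-mutated record of events

    def produce(self, event):
        self._log.append(event)

    def consume(self, offset):
        """Return all events from `offset` onward (replay-friendly)."""
        return self._log[offset:]

topic = Topic()
topic.produce({"type": "OrderCreated", "order_id": 1})
topic.produce({"type": "OrderShipped", "order_id": 1})

# A dashboard consumer reads from the start of the log...
dashboard_events = topic.consume(0)
# ...and a consumer onboarded later can still replay everything.
late_consumer_events = topic.consume(0)

print(len(dashboard_events))            # 2
print(late_consumer_events[0]["type"])  # OrderCreated
```

The producer never references a consumer, which is the decoupling property the article describes; retention and durability are what a real broker adds on top.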
How Data Architects Can Use Kafka in Application Integration
Data architects can use Kafka for application integration, modernization and evolution. Some companies are using a simple pattern to achieve this, called an entity builder.
An entity represents a “noun” in the business domain (such as Customer, Order, Product, Employee, Department, Invoice). It is unique, described by a set of attributes and exists independently or dependently in the system. Business processes are built around entities, their relationships and attributes.
Applications store entities and their relationships in a database. After corporate acquisitions or the onboarding of new SaaS applications, it’s common for the same entities, like a customer, to be spread across multiple applications. Merging applications is often impractical. Instead, architects let each application specialize its operations to a particular process or service, but use Kafka to build a centralized log that allows the sharing of common entities.
If the entity is a “Customer,” each application saves all its “Customer” records to a topic and accounts for any changes made to these records. The changes can be fed to a simple event-driven microservice, or you can write or extend a monolith to handle it. Ideally, you’ve done some data modeling work up front to make sure the business owners, data engineers and developers are on the same page about what a “Customer” is and what attributes a customer has in each application. But you don’t have to do a lot of data modeling to get started and to get value.
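One hedged way to sketch the entity-builder pattern: treat the Customer topic as a sequence of keyed change events and fold them into the latest known state per customer, much as a compacted topic or table view would. The function and field names below are illustrative, not a Kafka API:

```python
# Sketch: fold keyed "Customer" change events from a topic into the
# latest state per customer_id, as a compacted table view would.
# Names here are illustrative; this is not a Kafka client API.

def build_customer_view(events):
    view = {}
    for event in events:  # events arrive in log order
        key = event["customer_id"]
        state = view.setdefault(key, {})
        state.update(event["changes"])  # latest value per attribute wins
    return view

events = [
    {"customer_id": "c1", "changes": {"name": "Ada", "city": "Paris"}},
    {"customer_id": "c2", "changes": {"name": "Bo"}},
    {"customer_id": "c1", "changes": {"city": "Lyon"}},  # address change
]

view = build_customer_view(events)
print(view["c1"]["city"])  # Lyon (the latest change wins)
```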
Simple Data Modeling To Support Data Liberation
Think about the simple use cases first. Get the application owners together to build a data model around a shared understanding of the “Customer” entity. They need to identify and agree on what a customer is. Sounds simple, right? But things can get complex fast, and you want to document this as precisely as possible up front before you start the design work.
Your application owners might be reluctant to consider sharing their data. If cajoling the app owners to work together doesn’t work, find a senior executive who can be sold on the business benefits that can move the needle. It can help to point to the business value and opportunities from sharing this information in a Kafka system.
For example, in the financial services industry, let’s assume we have four applications for banking, credit cards, investments and payments. To better understand the meaning of a “Customer” entity across these applications, a data modeling process could ask the following questions:
- What customer identifiers (social security number, date of birth, passport number) are shared across applications and are unique to each customer?
- What information associated with the unique identifier can we use to create a primary key for joins across all application customer entities?
- What attributes are most useful outside the specific application and should be denormalized and shared with the customer entity?
- What customer information would improve the business impact of their application?
- What shared information should be protected from inappropriate access based on privacy, security and compliance requirements?
- As you create derived or aggregated values for your data model, make sure the values and dimensions are calculated consistently.
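The identifier questions above can be made concrete with a small sketch: derive a stable join key from identifiers the applications share. Hashing a social security number plus date of birth here is purely illustrative; a real design would have to weigh privacy, salting/tokenization and collision requirements:

```python
import hashlib

# Illustrative only: derive a stable primary key for cross-application
# joins from identifiers that are shared and unique per customer.
# A real design must treat SSN as sensitive data (salting, tokenization,
# access controls), not hash it naively like this.

def customer_key(ssn, date_of_birth):
    raw = f"{ssn}|{date_of_birth}".encode("utf-8")
    return hashlib.sha256(raw).hexdigest()[:16]

# The same person in the banking and credit card apps gets the same key,
# so their records can be joined downstream.
banking_key = customer_key("123-45-6789", "1985-04-12")
cards_key = customer_key("123-45-6789", "1985-04-12")
print(banking_key == cards_key)  # True
```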

Liberated application “Customer” entities shared via Kafka.
Once you have the shared data model and understanding of the customer entity, you can create Kafka topics based on this shared model from each application. Then the streams can be joined so that two or more streams, one from each institution containing customer IDs, can be used with a particular selection of columns/attributes for the use case at hand. This is an example of how the data modeling process also incorporates the materialized views you want for downstream applications. It also shows the power of event-driven data integration to support new use cases with liberated data.
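A minimal sketch of that join, assuming two per-application “Customer” views keyed by the shared customer ID and a use-case-specific column selection (in production this would typically be a Kafka Streams or Flink join rather than Python dicts):

```python
# Sketch: join two application "Customer" views on a shared customer ID,
# keeping only the columns the downstream use case needs.

banking = {
    "c1": {"name": "Ada", "balance": 1200},
    "c2": {"name": "Bo", "balance": 50},
}
cards = {
    "c1": {"card_limit": 5000},
    "c3": {"card_limit": 1000},
}

def inner_join(left, right, columns):
    joined = {}
    for key in left.keys() & right.keys():  # customers present in both apps
        merged = {**left[key], **right[key]}
        joined[key] = {c: merged[c] for c in columns if c in merged}
    return joined

result = inner_join(banking, cards, ["name", "balance", "card_limit"])
print(result)
# {'c1': {'name': 'Ada', 'balance': 1200, 'card_limit': 5000}}
```

The column list is where the materialized-view decision from the data modeling step shows up in code: each downstream use case selects only the attributes it agreed to share.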
Simple Application Development, Incremental Changes
For “Customer,” you can detect any change (new address, changed balances, new payments, etc.) and generate an event to another microservice to handle that particular change (broadcast address updates to other apps, generate offers based on the new address, compliance-related services, etc.).
This is a simple way to get started.
But make sure to finish the project to the end. Then evaluate what went right and what went wrong. Is there a pattern in requests from users you couldn’t meet due to something missing in either your event model or data product? What were the biggest implementation challenges? Did you choose the right data product? Did you contextualize the events in a way that matched consumption patterns? What is the next entity that could be externalized in a data stream that might add value to the existing one (“Customer” in our case)?
This data liberation approach avoids changing monoliths like legacy applications or their data models. You can continue to use them for operations, but modernize around them.
It lets you start building event-driven applications, simple but useful ones, right away. And real-time data sharing between applications becomes possible.
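The dispatch step described above can be sketched as a small routing table from change type to handlers; the handler names stand in for downstream microservices and are hypothetical:

```python
# Sketch: route "Customer" change events to handlers by change type.
# Each handler stands in for a downstream microservice (illustrative names).

handled = []  # records which downstream actions fired, for demonstration

def broadcast_address(event):
    handled.append(("address_sync", event["customer_id"]))

def make_offer(event):
    handled.append(("offer", event["customer_id"]))

HANDLERS = {
    "address_changed": [broadcast_address, make_offer],
    "payment_made": [],  # no subscribers yet; adding one touches nothing else
}

def dispatch(event):
    for handler in HANDLERS.get(event["type"], []):
        handler(event)

dispatch({"type": "address_changed", "customer_id": "c1"})
print(handled)  # [('address_sync', 'c1'), ('offer', 'c1')]
```

Adding a new reaction to an existing change type means appending a handler, not modifying the source application, which is the incremental-change property the pattern aims for.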
Conclusion
Integrating new data into existing, mature applications has never been an easy task. But Kafka provides mechanisms to solve this problem using building blocks that emphasize sharing and reusability.
The simple act of sharing event data across applications with Kafka creates business value without requiring costly or complex application changes. It provides opportunities to modernize enterprise data incrementally with reusable data building blocks.