As companies like Carousell push more reporting into cloud data platforms, a bottleneck is showing up inside business intelligence stacks. Dashboards that once worked fine at small scale begin to slow down, queries stretch into tens of seconds, and minor schema errors ripple through reports. In short, teams find themselves balancing two competing needs: stable executive metrics and flexible exploration for analysts.
The tension is becoming common in cloud analytics environments, where business intelligence (BI) tools are expected to serve both operational reporting and deep experimentation. The result is often a single environment doing too much – acting as a presentation layer, a modelling engine, and an ad-hoc compute system all at once.
A recent architecture change inside Southeast Asian marketplace Carousell shows how some analytics teams are responding. Details shared by the company’s analytics engineers describe a move away from a single overloaded BI instance toward a split design that separates performance-critical reporting from exploratory workloads. While the case reflects one organisation’s experience, the underlying problem mirrors broader patterns seen in cloud data stacks.
When BI becomes a compute bottleneck
Modern BI tools let teams define logic directly in the reporting layer. That flexibility can speed up early development, but it also shifts compute pressure away from optimised databases and into the visualisation tier.
At Carousell, engineers found that analytical “Explores” were frequently linked to extremely large datasets. According to Analytics Lead Shishir Nehete, datasets often reached “hundreds of terabytes in size,” with joins executed dynamically inside the BI layer rather than upstream in the warehouse. The design worked – until scale exposed its limits.
Nehete explains that heavy derived joins led to slow execution paths. “Explores” pulling large transaction datasets were assembled on demand, which increased compute load and pushed query latency higher. The team found that 98th percentile query times averaged roughly 40 seconds – long enough to disrupt business reviews and stakeholder meetings. The figures are based on Carousell’s internal performance monitoring, as provided by the analytics team.
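For readers unfamiliar with the metric, the 98th percentile (p98) is the latency below which 98% of queries complete, so a small tail of slow joins can dominate it. A minimal sketch of the calculation, using invented numbers rather than Carousell’s data:

```python
def p98(latencies):
    """98th percentile by the nearest-rank method: the latency
    below which roughly 98% of queries complete."""
    ordered = sorted(latencies)
    rank = max(0, int(round(0.98 * len(ordered))) - 1)
    return ordered[rank]

# Invented sample: 90 fast queries, a few medium ones, and a slow
# tail caused by heavy on-demand joins.
samples = [1.2] * 90 + [8.0] * 7 + [41.0, 43.0, 45.0]
print(p98(samples))  # 41.0
```

Even though most queries here finish in about a second, the handful of slow joins is what the p98 figure – and anyone waiting on a dashboard in a meeting – actually sees.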
Performance was only part of the problem: governance gaps created additional risk. Developers could push changes directly into production models without rigorous tests, which sped up feature delivery but introduced fragile dependencies. A tiny error in a field definition could cause downstream dashboards to fail, forcing engineers into reactive fixes.
Separating stability from experimentation
Rather than continue fine-tuning the existing environment, Carousell engineers chose to rethink where compute work should live. Heavy transformations were moved upstream into BigQuery pipelines, where database engines are designed to perform large joins. The BI layer shifted toward metric definition and presentation.
The bigger change came from splitting responsibilities across two BI instances. One environment was dedicated to pre-aggregated executive dashboards and weekly reporting. Its datasets were prepared in advance, allowing leadership queries to run against optimised tables instead of raw transaction volumes.
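The shift can be pictured in miniature: a scheduled job rolls raw transactions up once, and dashboard queries then read a small summary table instead of joining raw rows on every page load. The table and field names below are hypothetical, and the Python rollup stands in for what would in practice be a BigQuery pipeline step:

```python
from collections import defaultdict

# Hypothetical raw transaction rows, as a warehouse table might hold them.
transactions = [
    {"date": "2024-01-01", "category": "electronics", "amount": 120.0},
    {"date": "2024-01-01", "category": "fashion", "amount": 35.0},
    {"date": "2024-01-02", "category": "electronics", "amount": 80.0},
]

def build_daily_summary(rows):
    """Pre-aggregate once, upstream: total sales per (date, category)."""
    summary = defaultdict(float)
    for row in rows:
        summary[(row["date"], row["category"])] += row["amount"]
    return dict(summary)

# Scheduled pipeline step runs the expensive work once...
daily_summary = build_daily_summary(transactions)

# ...so the executive dashboard becomes a cheap lookup against the
# small summary table, not a scan-and-join over raw transactions.
print(daily_summary[("2024-01-01", "electronics")])  # 120.0
```

The trade-off is freshness for speed: the summary is only as current as the last pipeline run, which is acceptable for weekly executive reporting but not for ad-hoc exploration – hence the second instance.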
The second environment remains open for exploratory analysis. Analysts can still join granular datasets and test new logic without risking performance degradation in their executive colleagues’ workflows.
The dual structure reflects a broader cloud analytics principle: isolate high-risk or experimental workloads from production reporting. Many data engineering teams now apply similar patterns in warehouse staging layers or sandbox projects. Extending that separation into the BI tier helps maintain predictable performance under growth.
Governance as part of infrastructure
Stability also depended on stronger release controls. BI Engineer Wei Jie Ng describes how the new environment introduced automated checks through Looker CI and Look At Me Sideways (LAMS), tools that validate modelling rules before code reaches production. “The system now automatically catches SQL syntax errors,” Ng says, adding that failed checks block merges until issues are corrected.
Beyond syntax validation, governance rules enforce documentation and schema discipline. Each dimension requires metadata, and connections must point to approved databases. The controls reduce human error while creating clearer data definitions – an important foundation as analytics tools begin to add conversational interfaces.
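Rules of this kind amount to simple pre-merge assertions over a model definition. The sketch below is a hand-rolled illustration of the idea – it is not LAMS’s actual rule syntax, and the connection and dimension names are invented:

```python
APPROVED_CONNECTIONS = {"warehouse_prod"}  # hypothetical allow-list

def validate_model(model):
    """Return a list of governance violations; an empty list means mergeable."""
    errors = []
    if model.get("connection") not in APPROVED_CONNECTIONS:
        errors.append(f"connection {model.get('connection')!r} is not approved")
    for dim in model.get("dimensions", []):
        if not dim.get("description"):
            errors.append(f"dimension {dim['name']!r} is missing a description")
    return errors

model = {
    "connection": "warehouse_prod",
    "dimensions": [
        {"name": "order_date", "description": "Date the order was placed"},
        {"name": "gmv"},  # missing metadata -> CI would block the merge
    ],
}
for err in validate_model(model):
    print("BLOCKED:", err)
```

In a CI pipeline, any non-empty result would fail the build, which is the mechanism Ng describes: the merge is blocked until the model is documented and points at an approved source.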
According to Carousell engineers, structured metadata prepares datasets for natural-language queries. When conversational analytics tools read well-defined models, they can map user intent to consistent metrics instead of guessing at relationships.
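A toy lookup illustrates the point: when each metric carries synonyms in its metadata, a conversational layer can resolve a question to a governed metric instead of guessing. The registry and metric names below are invented for illustration:

```python
# Hypothetical metric registry, built from model metadata.
METRICS = {
    "weekly_active_users": {"synonyms": {"wau", "weekly actives"}},
    "gross_merchandise_value": {"synonyms": {"gmv", "sales volume"}},
}

def resolve_metric(question):
    """Map a natural-language question to a governed metric, if a term matches."""
    text = question.lower()
    for metric, meta in METRICS.items():
        if any(term in text for term in meta["synonyms"]):
            return metric
    return None

print(resolve_metric("How did GMV trend last quarter?"))  # gross_merchandise_value
```

Real conversational tools use far richer matching, but the principle is the same: the answer is anchored to a defined metric, so every stakeholder asking about “GMV” gets the same number.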
Performance gains – and fewer firefights
After the redesign, the analytics team reported measurable improvements. Internal monitoring shows those 98th percentile query times falling from over 40 seconds to under 10 seconds. The change altered how business reviews unfold: instead of asking whether dashboards were broken, stakeholders could focus on evaluating the data live. Just as importantly, engineers could shift away from constant troubleshooting.
While every analytics environment has unique constraints, the broader lesson is simple: BI layers should not double as heavy compute engines. As cloud data volumes grow, separating presentation, transformation, and experimentation reduces fragility and keeps reporting predictable.
For teams scaling their analytics stacks, the question isn’t about tool choice but about architectural boundaries – deciding which workloads belong in the warehouse and which stay in BI.
(Picture by Shutter Speed)

CloudTech News is powered by TechForge Media.
