Data Analyst
Job Description:
About the role
We are building the analytics layer over our ERP app. You'll own the analytics engineering core: modeling raw change data into clean marts, instrumenting tests and docs, and shipping decision-grade dashboards that our users can reason over.
Model the warehouse
Design and implement stg_, dim_, fact_, and mart_ layers in dbt for domains like HR, Finance/AR, Inventory, and Manufacturing.
Define grain clearly; implement SCDs, incremental models, snapshots, seeds, exposures, and packages.
Create rigorous tests (unique, not_null, relationships, custom macros) and data contracts.
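As a flavor of the modelling work above — an incremental dbt model with an explicitly stated grain. This is a minimal sketch only: the model, column, and source names are hypothetical, and the `delete+insert` strategy assumes an adapter (such as dbt-clickhouse) that supports it.

```sql
-- models/marts/fact_ar_invoices.sql (hypothetical)
-- Grain: one row per invoice line.
{{ config(
    materialized='incremental',
    unique_key='invoice_line_id',
    incremental_strategy='delete+insert'
) }}

select
    invoice_line_id,
    invoice_id,
    customer_id,
    invoice_date,
    amount
from {{ ref('stg_ar_invoices') }}
{% if is_incremental() %}
  -- On incremental runs, only reprocess rows at or after the latest
  -- date already loaded into this model.
  where invoice_date >= (select max(invoice_date) from {{ this }})
{% endif %}
```

The matching schema tests (unique, not_null, relationships) would live alongside it in a schema YAML file.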
ClickHouse performance
Optimize schemas, partitioning, primary keys/order by, materialized views; handle late-arriving data and CDC merge logic.
Own query performance and cost/performance trade-offs.
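To make the ORDER BY / partitioning / CDC-merge bullet concrete, here is one common pattern — a ReplacingMergeTree table that absorbs CDC upserts and late-arriving rows by version. Table and column names are illustrative, not our actual schema.

```sql
-- Hypothetical CDC landing table: ReplacingMergeTree collapses rows that
-- share the ORDER BY key, keeping the one with the highest _version.
CREATE TABLE fact_inventory_moves
(
    org_id      UInt32,
    move_id     UInt64,
    moved_at    DateTime,
    qty         Decimal(18, 4),
    _version    UInt64,          -- CDC log sequence number
    _is_deleted UInt8 DEFAULT 0  -- soft-delete flag from the CDC stream
)
ENGINE = ReplacingMergeTree(_version)
PARTITION BY toYYYYMM(moved_at)   -- monthly parts keep merges cheap
ORDER BY (org_id, move_id);       -- sort key doubles as the primary key

-- Reads must tolerate not-yet-merged duplicates, e.g. via FINAL:
SELECT org_id, sum(qty) AS total_qty
FROM fact_inventory_moves FINAL
WHERE _is_deleted = 0
GROUP BY org_id;
```

The trade-off between `FINAL` on read and deduplicating at merge/materialized-view time is exactly the kind of cost/performance call this role owns.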
Dashboards that drive action
Build Superset datasets, charts, and curated dashboards; set conventions (naming, filters, time grains).
Contribute to/consume a chart registry exporter so charts/metrics are machine-readable by our LLM layer.
Governance & security
Implement RLS/RBAC patterns across Superset/ClickHouse; parameterize org/user scopes.
Keep dbt docs healthy; publish manifest/catalog; maintain docs_index.jsonl for downstream search.
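For the RLS/RBAC bullet, a minimal sketch of org scoping at the ClickHouse layer using a row policy — the table, role, and org value are hypothetical placeholders:

```sql
-- Hypothetical: restrict a per-tenant Superset service role to its org's rows.
CREATE ROW POLICY org_scope ON default.fact_ar_invoices
    FOR SELECT USING org_id = 42
    TO superset_org_42;
```

In practice this is paired with Superset-side row-level security filters so that org/user scopes are parameterized rather than hard-coded per role.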
Ops & quality
Set up CI for dbt (build + tests on PR), freshness SLAs, run artifacts, observability.
Write excellent README/runbooks; coach stakeholders on metric definitions.
Must-have experience
2–6 years in Analytics Engineering / Data Warehousing.
Expert SQL and dbt (2+ years): modular modelling, incremental strategies, macros, exposures, docs, tests.
Strong dimensional modeling and clear articulation of grain.
Production experience with ClickHouse (or strong experience with a comparable columnar warehouse such as BigQuery, Redshift, or Snowflake, plus willingness to go deep on ClickHouse fast).
Building BI artifacts in Superset (datasets, semantic fields, dashboards, filters).
Comfort with CDC concepts, JSON extraction, and merge patterns; Git, code review, and basic Docker workflows.
Nice to have
Python for analytics tooling (registry exporters, dbt utilities).
Experience with data quality frameworks (Elementary, Great Expectations, or custom dbt tests).
RLS patterns in ClickHouse/Superset; multi-tenant data modeling.
Understanding of LLM-aware analytics (documenting metrics/semantics so AI can reason over them).
Domain familiarity in any of: HR, AR/AP, Manufacturing/Inventory, Diamond/Jewellery, Textiles.
Tooling you’ll touch
Warehousing: ClickHouse
Modeling: dbt Core
BI: Apache Superset
Ops: GitHub Actions (CI for dbt), Docker
Lang: SQL, Python
Company Profile
Company is a Cloud ERP for Growing Businesses.
Telephonic Interview Available
- Telephonic interviews are scheduled for this job opening.
- Interested candidates are requested to apply to receive the recruiter's contact number for the telephonic interview.
- Candidates can call the recruiter on the given number during working hours to start the telephonic interview.