Senior Data Analyst
At CloudFactory, we are a mission-driven team passionate about unlocking the potential of AI to transform the world. By combining advanced technology with a global network of talented people, we make unusable data usable, driving real-world impact at scale. More than just a workplace, we’re a global community founded on strong relationships and the belief that meaningful work transforms lives. Our commitment to earning, learning, and serving fuels everything we do as we strive to connect one million people to meaningful work and build leaders worth following.

Our Culture

At CloudFactory, we believe in building a workplace where everyone feels empowered, valued, and inspired to bring their authentic selves to work. We are:
• Mission-Driven: We focus on creating economic and social impact.
• People-Centric: We care deeply about our team’s growth, well-being, and sense of belonging.
• Innovative: We embrace change and find better ways to do things together.
• Globally Connected: We foster collaboration between diverse cultures and perspectives.
If you’re passionate about innovation, collaboration, and making a real impact, we’d love to have you on board!

Role Summary

As a Senior Data Analyst, you will independently deliver structured analytical interpretation across client workstreams and selected enterprise priorities within Enterprise QSE. You will work closely with stakeholders across Quality, Delivery, Workforce, Finance, Technology, and related functions to apply statistical reasoning, hypothesis-driven analysis, and practical analytical techniques to identify performance risk, explain operational instability, and strengthen decision-making. This role is well suited to someone who can operate confidently in ambiguous environments, move beyond reporting into diagnosis and insight generation, work effectively in spreadsheet-heavy operating contexts, and contribute to stronger analytical discipline across the function.

Responsibilities

Advanced Performance Analysis
• Conduct multi-dimensional analysis across accuracy, throughput, SLA adherence, workforce trends, queue performance, and financial or service-risk indicators.
• Distinguish natural performance variation from meaningful deviation using structured analytical methods.
• Identify likely drivers behind quality dips, adjustment spikes, instability patterns, and workstream deterioration.
• Use segmentation to isolate patterns across worker groups, task types, shifts, workflows, or use cases.
• Provide clear analytical summaries and practical recommendations to Quality and Delivery leadership.
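For candidates wondering what "distinguishing natural variation from meaningful deviation" can look like in practice, here is one minimal, hypothetical sketch using a control-limit style check (the metric values and the 3-sigma threshold are illustrative assumptions, not CloudFactory's actual method):

```python
import statistics

def flag_deviation(history, latest, k=3.0):
    """Flag a value that sits more than k standard deviations from the
    historical mean -- a simple control-limit style deviation check."""
    mean = statistics.mean(history)
    sd = statistics.pstdev(history)
    return abs(latest - mean) > k * sd

# Hypothetical daily accuracy rates for one workstream.
history = [0.962, 0.958, 0.965, 0.960, 0.963, 0.959, 0.961, 0.964]
print(flag_deviation(history, 0.961))  # False: within normal variation
print(flag_deviation(history, 0.890))  # True: a meaningful deviation
```

In real workstreams the analyst would of course choose the window, threshold, and metric to match the operational context.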
Statistical Analysis & Performance Risk Interpretation
• Apply practical statistical methods to test hypotheses, compare performance segments, and assess whether observed patterns are meaningful.
• Use sound reasoning around variance, distributions, trend interpretation, and sampling when analyzing operational data.
• Support structured intervention analysis where process or workflow changes need to be evaluated.
• Translate statistical findings into practical implications for operational stakeholders.
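As a flavor of the kind of segment comparison described above, a two-proportion z-test can show whether an accuracy gap between two worker groups is larger than chance sampling variation would explain. This is a minimal stdlib sketch with made-up review counts, not a prescribed method:

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-proportion z-test: is the difference in pass rates between
    two segments bigger than sampling noise alone would produce?"""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical QA review counts for two worker groups.
z, p = two_proportion_z(940, 1000, 905, 1000)
print(f"z={z:.2f}, p={p:.4f}")
```

A small p-value here would suggest the segments genuinely differ, which is the point at which "translate statistical findings into practical implications" begins.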
Sampling Design & Measurement Integrity
• Design and refine sampling approaches for system accuracy measurement, performance validation, and targeted investigations.
• Ensure sampling methods are representative, consistent, and aligned to the analytical purpose.
• Validate accuracy calculations, sample assumptions, and interpretation logic used in reporting and governance.
• Strengthen confidence in accuracy measurement practices across workstreams.
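One common way to keep a sample "representative and aligned to the analytical purpose" is proportional stratified sampling. The sketch below is purely illustrative (the workflow names and counts are invented) and ignores edge cases a real design would handle, such as rounding drift across strata:

```python
import random

def stratified_sample(records, strata_key, total_n, seed=0):
    """Draw a sample whose strata proportions mirror the population,
    so accuracy estimates are not skewed toward over-represented groups."""
    rng = random.Random(seed)
    strata = {}
    for rec in records:
        strata.setdefault(rec[strata_key], []).append(rec)
    sample = []
    for items in strata.values():
        n = round(total_n * len(items) / len(records))
        sample.extend(rng.sample(items, min(n, len(items))))
    return sample

# Hypothetical task records across three workflows (600/300/100 split).
records = (
    [{"workflow": "A"}] * 600 + [{"workflow": "B"}] * 300 + [{"workflow": "C"}] * 100
)
sample = stratified_sample(records, "workflow", 50)
print(len(sample))  # 50 records, allocated 30/15/5 across A/B/C
```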
Workstream Performance Signals
• Contribute to the definition and refinement of leading and lagging indicators at workstream level.
• Identify early signs of SLA instability, quality deterioration, throughput stress, rework patterns, or mismatch between internal metrics and client-observed outcomes.
• Improve the usefulness of internal performance measures in reflecting actual service experience.
• Contribute analytical support to at-risk workstream monitoring and related risk reviews.
Reporting Logic & Data Reliability
• Build, validate, and improve dashboards, analytical views, and metric logic across workstreams.
• Use strong SQL and practical Python or R skills to extract, join, validate, and analyze raw operational data.
• Build and maintain advanced spreadsheet-based analytical models, formulas, validation logic, and lightweight scripts where reporting, control, or investigation workflows still rely on Google Sheets or Excel.
• Identify structural data gaps, inconsistent metric definitions, and reporting weaknesses that reduce trust in outputs.
• Partner with relevant teams to improve data quality, reporting consistency, and calculation clarity.
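The "extract, join, validate" work above often amounts to queries like the following. This is a self-contained, hypothetical example (in-memory SQLite with invented `tasks` and `reviews` tables, not CloudFactory's actual schema) that computes per-workstream accuracy while also surfacing tasks with no review coverage:

```python
import sqlite3

# In-memory example: join task output to QA reviews, compute accuracy
# per workstream, and flag tasks that have no review coverage at all.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE tasks (task_id INTEGER, workstream TEXT);
    CREATE TABLE reviews (task_id INTEGER, passed INTEGER);
    INSERT INTO tasks VALUES (1,'A'),(2,'A'),(3,'B'),(4,'B');
    INSERT INTO reviews VALUES (1,1),(2,0),(3,1);
""")
rows = conn.execute("""
    SELECT t.workstream,
           COUNT(r.task_id)            AS reviewed,
           COUNT(*) - COUNT(r.task_id) AS unreviewed,
           AVG(r.passed)               AS accuracy
    FROM tasks t
    LEFT JOIN reviews r ON r.task_id = t.task_id
    GROUP BY t.workstream
    ORDER BY t.workstream
""").fetchall()
for workstream, reviewed, unreviewed, accuracy in rows:
    print(workstream, reviewed, unreviewed, accuracy)
```

Note the LEFT JOIN: an inner join would silently drop unreviewed tasks, which is exactly the kind of "reporting weakness that reduces trust in outputs" this role is asked to catch.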
Cross-Functional Analytical Partnership
• Partner with Quality, Delivery, Workforce, Finance, and Technology stakeholders on complex performance-related analysis.
• Translate analytical findings into clear, actionable recommendations for business and operational leaders.
• Support enterprise initiatives such as RCA improvement, workflow redesign, incident analysis, and automation-related performance review through structured analysis.
• Operate effectively in ambiguous environments where data quality, definitions, or system logic may still be evolving.
Capability Support & Analytical Discipline
• Provide review support, practical guidance, and analytical quality checks for junior analysts where needed.
• Help strengthen consistency in documentation, metric interpretation, and reporting logic across the team.
• Apply structured problem-solving methods, including DMAIC where relevant, to improve analytical repeatability and quality.
• Contribute to stronger statistical literacy and analytical consistency within Enterprise QSE through coaching, examples, and review feedback.