This page documents my learning progression throughout a structured Data Technician Skills Bootcamp. Each week combined theory, practical exercises, and applied tasks, with an emphasis on building job-ready analytical and technical capability rather than isolated tool use.
Alongside the embedded weekly workbooks, the summaries below capture additional tasks, objectives, and applied work completed at each stage of the programme.
Week 1 — Data Fundamentals & Excel Foundations
Building confidence using Excel as an analysis tool: structuring data correctly, producing summaries, and creating stakeholder-friendly outputs.
Key tasks and objectives achieved
Researched and documented common laws/regulations relevant to working with customer data (why they matter, impacts, and consequences).
Worked hands-on with a retail dataset:
Converted raw ranges into an Excel Table
Performed sorting and basic analysis
Calculated totals and averages using SUM and AVERAGE
Built a Pivot Table summarising sales by county and product.
Added a derived column to categorise products by sales volume using SWITCH.
Practised charting/visual communication using a Bike Sales visualisation task.
Completed a scenario task framing analysis for a board audience (how findings would be presented, including considerations like renewal behaviour).
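As a rough illustration of that Excel workflow, the same totals, pivot summary, and SWITCH-style categorisation can be sketched in pandas. The column names (County, Product, Sales) and the volume thresholds are assumptions for illustration, not the real dataset's values:

```python
import pandas as pd

# Hypothetical retail rows standing in for the bootcamp dataset;
# real column names and thresholds may differ.
sales = pd.DataFrame({
    "County":  ["Kent", "Kent", "Essex", "Essex", "Surrey"],
    "Product": ["Bike", "Helmet", "Bike", "Helmet", "Bike"],
    "Sales":   [1200, 300, 950, 180, 2100],
})

# Totals and averages (the SUM / AVERAGE step)
total_sales = sales["Sales"].sum()
average_sale = sales["Sales"].mean()

# Pivot Table equivalent: sales summarised by county and product
pivot = sales.pivot_table(index="County", columns="Product",
                          values="Sales", aggfunc="sum", fill_value=0)

# Derived category column, analogous to Excel's SWITCH/IFS logic
def volume_band(value):
    if value >= 1000:
        return "High"
    elif value >= 500:
        return "Medium"
    return "Low"

sales["Volume band"] = sales["Sales"].apply(volume_band)
print(pivot)
print(sales)
```

The same ideas carry over: `pivot_table` plays the role of the Pivot Table, and the conditional function mirrors the SWITCH-based derived column.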
Week 2 — Tableau + Power BI (Dashboards & Insight)
Moving from spreadsheet outputs to interactive dashboards, and learning how different BI tools shape analysis and storytelling.
Key tasks and objectives achieved
Tableau research: compared Tableau versions and explained the limitations of Tableau Public.
Built a Tableau dashboard using the EMSI_JobChange_UK dataset including:
A line chart showing job change trends
A UK map showing affected city locations
Conducted open-ended exploratory analysis tasks:
Spotify dataset: found trends and documented insights that could support future organisational work
Health dataset: identified trends and reflected on how data supports NHS decision-making
Completed multiple guided Power BI labs (evidence captured via screenshots):
Getting data into Power BI Desktop
Loading transformed data
Designing a report
Creating a dashboard
Week 3 — Databases & MySQL (Relational Thinking)
Understanding how data is stored and queried in relational databases, and translating business requirements into database structure.
Key tasks and objectives achieved
Researched and answered questions on database fundamentals and key concepts.
Studied and explained common JOIN types, including when each join is appropriate and what kind of data relationships they support.
Completed a written, portfolio-relevant task:
Designed a database approach for a small retail business
Wrote a structured essay covering:
business requirements gathering
schema design (tables/entities/relationships/keys)
implementation considerations and steps
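To illustrate the JOIN behaviour studied this week, the sketch below uses Python's built-in sqlite3 module with a hypothetical customers/orders schema (a stand-in, not the schema designed in the essay). An INNER JOIN returns only customers with matching orders; a LEFT JOIN keeps every customer, filling unmatched rows with NULL:

```python
import sqlite3

# In-memory database with a hypothetical one-to-many relationship:
# one customer can place many orders.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (
        id INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customers(id),
        total REAL
    );
    INSERT INTO customers VALUES (1, 'Asha'), (2, 'Ben'), (3, 'Cara');
    INSERT INTO orders VALUES (10, 1, 25.0), (11, 1, 40.0), (12, 2, 15.0);
""")

# INNER JOIN: only customers who have at least one order
inner_rows = con.execute("""
    SELECT c.name, o.total
    FROM customers c
    INNER JOIN orders o ON o.customer_id = c.id
    ORDER BY o.id
""").fetchall()

# LEFT JOIN: every customer, with NULL (None) for those without orders
left_rows = con.execute("""
    SELECT c.name, o.total
    FROM customers c
    LEFT JOIN orders o ON o.customer_id = c.id
    ORDER BY c.id, o.id
""").fetchall()

con.close()
```

Here Cara appears only in the LEFT JOIN result, which is exactly the distinction that decides which join type fits a given business question.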
Week 4 — Career / Professional Development
This stage of the programme focused on professional readiness rather than on a technical workbook.
Week 5 — Cloud & Azure Concepts (Solution Thinking)
Cloud fundamentals and how organisations use Azure services to store, secure, integrate, and analyse data.
Key tasks and objectives achieved
Researched core cloud concepts and cloud service models.
Explained IaaS / PaaS / SaaS with examples and use cases.
Researched key cloud terms and how they are implemented in real organisations.
Reviewed security and legal topics, including themes from the Computer Misuse Act and related considerations.
Produced a structured Azure-focused proposal section (the groundwork for my Paws & Whiskers project), covering:
Data analysis tooling options (e.g., analytics services)
Data integration and automation concepts
Data types and modelling approach (entities/relationships)
Storage formats and structures
Security and encryption considerations
Backup/disaster recovery planning
Visualisation approach (Power BI)
Scalability considerations
Week 6 — Python for Data Analysis
Using Python to automate analysis, explore datasets quickly, and work with tabular data using pandas.
Key tasks and objectives achieved
Implemented FizzBuzz (classic interview problem) and captured both code and output.
Worked with a real CSV (student.csv) to practise pandas fundamentals:
Loading data into a DataFrame
Viewing head() / info()
Summary statistics and basic exploration
Worked with GDP (nominal) per Capita.csv:
Loading the file into a DataFrame
Printing the first and last rows
Selecting specific columns (e.g., Country/Territory, UN_Region)
Extended exploration tasks as a group: experimenting with additional datasets and outputs for practice.
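A minimal version of the FizzBuzz solution (reconstructed here for illustration, not the exact submitted code) might look like:

```python
def fizzbuzz(n):
    """Return the FizzBuzz sequence for 1..n as a list of strings."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:          # divisible by both 3 and 5
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print("\n".join(fizzbuzz(15)))
```

Checking divisibility by 15 first matters: testing 3 or 5 first would shadow the combined case.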
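The CSV exploration tasks above boil down to a handful of pandas calls. The sketch below swaps the real file for an inline stand-in (made-up rows, but using the column names the GDP task mentions) so it runs anywhere:

```python
import io
import pandas as pd

# Inline stand-in for "GDP (nominal) per Capita.csv" — only the columns
# referenced in the task, with made-up example rows.
csv_text = """Country/Territory,UN_Region,GDP_per_capita
Luxembourg,Europe,125006
Ireland,Europe,101509
Japan,Asia,33815
"""

df = pd.read_csv(io.StringIO(csv_text))
# With the real file this would be:
# df = pd.read_csv("GDP (nominal) per Capita.csv")

print(df.head())        # first rows
print(df.tail(1))       # last row
df.info()               # column dtypes and non-null counts
print(df.describe())    # summary statistics for numeric columns

# Selecting specific columns, as in the task
subset = df[["Country/Territory", "UN_Region"]]
print(subset)
```

The same pattern (`read_csv`, `head`, `info`, `describe`, column selection) covers the student.csv exercise as well.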