Monthly Archives: March 2026

As part of the TestingSaaS Skill Maturity Framework: Thinking like an IT-architect

Architectural Thinking: Moving Beyond Operations

By Cordny

Most engineers don’t get stuck because they lack effort.
They get stuck because they stay in operations mode.

They manage pipelines.
They respond to alerts.
They fix issues.

And they get very good at it.

But at some point, operational excellence stops translating into growth.

This is where architectural thinking begins.

The Plateau Between Operator and Architect

Within the TestingSaaS Skill Maturity Framework, this is the transition from problem solver to designing architect:

Level 2/3 → Level 4

From:

  • Managing systems
  • Executing tasks
  • Solving known problems

To:

  • Designing systems
  • Anticipating trade-offs
  • Influencing long-term decisions

Most engineers plateau here.

Not because they can’t grow.
But because they are never taught how.

What Is Architectural Thinking?

Architectural thinking is the ability to move from:

“How do I fix this?”

to:

“Why does this system behave this way, and how should it be designed instead?”

It’s about seeing systems as interconnected, evolving structures, not just components.

Key characteristics

An architectural thinker:

  • Understands cause and effect across systems
  • Thinks in trade-offs (cost vs performance vs reliability)
  • Designs for failure, not just success
  • Considers long-term impact, not just quick fixes

The Operational Trap

Operations feels productive.

You:

  • Close tickets
  • Improve pipelines
  • Fix incidents

But over time:

❌ You optimize symptoms
❌ You repeat patterns
❌ You stay reactive

Without architectural thinking, you become:

A highly efficient operator in a poorly designed system, operating in chaos.

The Shift: From Doing to Designing

To move forward, your mindset must shift:

Operational Thinking → Architectural Thinking

  • Fix the issue → Redesign the system
  • Follow best practices → Question assumptions
  • Focus on components → Focus on interactions
  • React to alerts → Prevent failure modes

In short: you become a proactive architect.

Where This Fits in the TestingSaaS Skill Maturity Framework

In the TestingSaaS Skill Maturity Framework, this shift looks like:

Level 2/3 — Operator / Analyst

  • Manages monitoring and pipelines
  • Performs root cause analysis
  • Solves known issues

Level 4 — System Thinking (Architect)

  • Designs systems with intent
  • Understands trade-offs
  • Influences architecture decisions

Level 5 — Strategic Technologist

  • Aligns systems with business goals
  • Optimizes across teams
  • Thinks in sustainability and impact

Architectural thinking is the gateway skill.

Let’s illustrate it with some examples.

Example 1 — Performance Issue

Operator mindset:

  • Optimize query
  • Add caching
  • Scale server

Architect mindset:

  • Why is this request expensive?
  • Should this be synchronous?
  • Can we redesign data flow?
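To make the architect's "should this be synchronous?" question concrete, here is a minimal sketch, contrasting the operator fix (tune each call) with a redesign of the data flow. The endpoint names and delays are hypothetical, invented purely for illustration:

```python
import asyncio

async def fetch(name: str, delay: float) -> str:
    # Stand-in for a slow downstream call (hypothetical endpoint names).
    await asyncio.sleep(delay)
    return name

async def page_sequential():
    # Operator fix: optimize each call, but keep them one after another.
    profile = await fetch("profile", 0.1)
    orders = await fetch("orders", 0.1)
    return [profile, orders]

async def page_concurrent():
    # Architect question: do these calls need to be sequential at all?
    # If they are independent, run them together instead.
    return list(await asyncio.gather(fetch("profile", 0.1), fetch("orders", 0.1)))
```

Same result, roughly half the wall-clock time, without touching either downstream service: the win comes from questioning the data flow, not from optimizing the components.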

Example 2 — Observability

Operator mindset:

  • Add dashboards
  • Set alerts

Architect mindset:

  • What signals actually matter?
  • Are we measuring user experience or system noise?
  • How does observability support decision-making?
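As a sketch of "measuring user experience rather than system noise": instead of alerting on raw CPU, summarize requests against a user-facing objective. The SLO target and latency budget below are illustrative assumptions, not figures from the framework:

```python
from dataclasses import dataclass

@dataclass
class Request:
    latency_ms: float
    ok: bool

def slo_report(requests, slo_target=0.995, latency_budget_ms=300):
    """Summarize user-facing health: a request counts as 'good' only if it
    succeeded AND came back within the latency budget."""
    total = len(requests)
    good = sum(1 for r in requests if r.ok and r.latency_ms <= latency_budget_ms)
    good_rate = good / total
    return {"good_rate": good_rate, "slo_met": good_rate >= slo_target}
```

A signal like this supports decision-making directly: if the SLO is met, new alerts on internal metrics are probably noise; if it is not, the gap tells you where to look.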

Example 3 — Green IT

Operator mindset:

  • Reduce CPU usage
  • Optimize images

Architect mindset:

  • Can we reduce unnecessary computation entirely?
  • What is the carbon impact of this architecture?
  • Can we redesign for efficiency at system level?

Why Most Learning Resources Fail

Most content focuses on:

  • Tools
  • Tutorials
  • Implementation

Very little focuses on:

  • System design thinking
  • Trade-offs
  • Long-term architecture

That’s why many engineers stay stuck between Level 2 and 3.

How to Develop Architectural Thinking

1. Study systems, not tools

Instead of:

“How does this tool work?”

Ask:

“Why does this system exist?”

2. Practice trade-off thinking

Every decision has consequences:

  • Performance vs cost
  • Speed vs reliability
  • Simplicity vs flexibility

Train yourself to see them.

3. Reverse-engineer systems

Take an existing system and ask:

  • Why is it designed this way?
  • What are the bottlenecks?
  • What would I change?

4. Use observability as a thinking tool

Observability is not dashboards.

It’s a way to understand:

  • system behavior
  • user impact
  • hidden complexity

5. Think beyond code

Architecture includes:

  • infrastructure
  • data flow
  • team structure
  • business constraints

Final Thoughts

Skill growth is not about doing more.

It’s about thinking differently.

The move from Operator to Architect is not a step up in tools.

It’s a step up in perspective.

And once you make that shift:

You stop fixing systems.
You start shaping them.

👉 If you want to understand where you stand in this journey, explore the TestingSaaS Skill Maturity Framework on testingsaas.nl.

👉 And some free advice:

Follow this course to get the architect skills needed in this age of observability and AI.

Observability Strategy Pillars: Build Real Observability Capability

turning system data into quality insights

Becoming a Data-Savvy Analyst: The Next Step in Testing Maturity


Modern software teams produce enormous amounts of data.
Logs, metrics, traces, test results, performance dashboards, and customer usage signals are generated every second.

Yet in many teams, that data is barely used.

Tests are executed. Dashboards exist. Monitoring tools run. But few people translate that data into actionable insights about quality.

This is where the Data-Savvy Analyst emerges.

In the TestingSaaS Skill Maturity Framework, becoming data-savvy means moving beyond intuition and execution toward evidence-based quality decisions.

The Traditional QA Analyst

A traditional QA Analyst already thinks more strategically than an Operator.

They:

  • Perform risk-based testing
  • Analyze requirements
  • Identify coverage gaps
  • Communicate risks to stakeholders

They answer questions like:

  • What could break?
  • Where are our risky areas?
  • What should we test before release?

But their insights often rely on experience and reasoning, not always on measurable system behavior.

And that’s where the next evolution begins.

The Data-Savvy Analyst

Image: Throughput analysis using Datadog

A Data-Savvy Analyst adds a new capability:

They use production and testing data to guide quality decisions.

Instead of asking only what might break, they ask:

  • What does the data tell us about system behavior?
  • Where do users actually experience problems?
  • Which parts of the system generate the most errors?
  • What patterns appear in logs, metrics, and traces?

This analyst connects multiple information sources:

  • Test results
  • Observability data
  • Performance metrics
  • Production incidents
  • User behavior analytics

Quality becomes measurable and observable.

Why Data Literacy Is Becoming Essential

In modern SaaS environments, systems are too complex to understand through testing alone.

Applications now include:

  • Microservices
  • APIs
  • Third-party integrations
  • Cloud infrastructure
  • Continuous deployment

Failures often appear in production conditions, not just in test environments.

This means quality engineers must learn to interpret operational signals such as:

  • Error rates
  • Latency spikes
  • Usage patterns
  • Resource consumption

Without this perspective, testing remains blind to real-world behavior.
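As a minimal sketch of interpreting such signals, here is how error rate and p95 latency could be computed per service from raw request records. The service names and numbers are made up for illustration, not real production data:

```python
import math

def summarize(records):
    """Per-service error rate and nearest-rank p95 latency.
    records: iterable of (service, latency_ms, status_code)."""
    by_service = {}
    for service, latency, status in records:
        by_service.setdefault(service, []).append((latency, status))
    summary = {}
    for service, rows in by_service.items():
        latencies = sorted(latency for latency, _ in rows)
        errors = sum(1 for _, status in rows if status >= 500)
        idx = max(0, math.ceil(0.95 * len(latencies)) - 1)  # nearest-rank p95
        summary[service] = {"error_rate": errors / len(rows),
                            "p95_ms": latencies[idx]}
    return summary

# Hypothetical request log: (service, latency_ms, status_code)
records = [
    ("checkout", 120, 200), ("checkout", 950, 500), ("checkout", 140, 200),
    ("search", 80, 200), ("search", 90, 200), ("search", 110, 200),
]
```

Even this toy summary already answers an analyst's question: checkout, not search, is where errors and latency spikes cluster.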

The Shift from Test Results to System Insights

Traditional testing focuses on pass/fail outcomes.

Data-savvy analysis focuses on behavioral patterns.

Instead of asking:

Did the test pass?

The Data-Savvy Analyst asks:

  • How often does this endpoint fail in production?
  • Which user flows generate the most latency?
  • Which features are barely used but heavily tested?
  • Where do incidents cluster in the architecture?

Testing becomes part of a broader discipline: observing system health.

Skills That Define a Data-Savvy Analyst

Developing this capability requires new skills.

Understanding Observability Data

Data-savvy analysts work with:

  • Logs
  • Metrics
  • Distributed traces
  • Performance telemetry

Tools might include observability platforms or monitoring dashboards.

But the important skill is interpreting patterns, not just reading charts.

Asking Quantitative Questions

Data literacy begins with curiosity.

Examples of useful questions:

  • Which component causes the most incidents?
  • What percentage of traffic hits this feature?
  • How does performance change after deployment?
  • What signals indicate quality degradation?

These questions turn raw data into insights.

Connecting Testing with Production Reality

The Data-Savvy Analyst connects three worlds:

  1. Development
  2. Testing
  3. Operations

Instead of seeing testing as a separate phase, they treat quality as a continuous feedback loop.

Test results influence monitoring.
Monitoring insights influence test design.

Why Many Teams Struggle with This Transition

Despite the importance of data literacy, many teams struggle to develop it.

Common reasons include:

Tool Silos

Testing tools, monitoring platforms, and analytics dashboards are often separate.

Few teams actively connect them.

Lack of Analytical Training

Testers are trained to:

  • Design tests
  • Automate checks
  • Execute scenarios

They are rarely trained to analyze operational data.

Cultural Barriers

In some organizations:

  • QA owns testing
  • DevOps owns monitoring
  • Product owns analytics

The Data-Savvy Analyst crosses all three domains.

That requires collaboration and curiosity.

Why Data-Savvy Analysts Are Increasingly Valuable

As SaaS systems scale, quality decisions must become data-driven.

Organizations need professionals who can:

  • Interpret observability signals
  • Connect incidents with architectural weaknesses
  • Prioritize testing based on real usage patterns
  • Identify hidden reliability risks

These capabilities transform QA from a verification function into a decision-support discipline.

Practical Steps to Become a Data-Savvy Analyst

If you want to develop this capability, start with small habits.

Explore Your Monitoring Tools

Open dashboards used by DevOps teams and ask:

  • What metrics are tracked?
  • What alerts exist?
  • Which services produce the most errors?

Study Production Incidents

Every incident contains valuable learning signals.

Ask:

  • What failed?
  • What signals existed before the failure?
  • Could testing have detected it earlier?

Connect Observability with Test Strategy

Use operational data to guide testing priorities.

For example:

  • Focus tests on high-traffic features
  • Investigate areas with high error rates
  • Design performance tests based on real workloads

Testing becomes evidence-based.
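The three bullets above can be sketched as one tiny prioritization step. The features, traffic shares, and error rates are invented for illustration; the point is the ranking, risk ≈ traffic × error rate:

```python
def prioritize(features):
    """Rank features for test focus by a simple risk score:
    share of traffic times observed production error rate."""
    scored = [(name, round(traffic * errors, 6)) for name, traffic, errors in features]
    return sorted(scored, key=lambda item: item[1], reverse=True)

# (feature, traffic share, production error rate) — illustrative numbers
features = [
    ("login", 0.40, 0.002),
    ("checkout", 0.15, 0.030),
    ("reports", 0.05, 0.001),
]
```

Note the outcome: checkout outranks login despite far less traffic, because its error rate dominates — exactly the kind of evidence-based shift in test focus described above.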

The Future of Quality Engineering

The role of testing is evolving.

Operators execute tests.
Analysts reason about risk.
Data-Savvy Analysts interpret system behavior.

In modern SaaS environments, quality is no longer only about verification.

It is about understanding complex systems through data.

And the professionals who master that skill will shape the future of quality engineering.

How to become a Data-Savvy Analyst?

→ TestingSaaS Learning Resource Hub



TestingSaaS and InnovaTeQ partner up in IT Education

TestingSaaS and InnovaTeQ combine forces to shake up Dutch IT education


🔥 HOT OFF THE PRESS 🔥

TestingSaaS and InnovaTeQ are now partners in IT education

Over the last few months I was deeply involved in setting up an affiliate program for the Hungarian IT course provider InnovaTeQ.
Ádám Tóth, founder of InnovaTeQ, and I share a vision: IT courses that mix engineering, business, and the use of tooling.
In today's market these subjects are mostly taught separately, which denies you the big picture you need as an IT professional, especially in the age of AI.


So we started working together, as content creators and affiliate partners.

Why did I become an affiliate partner with InnovaTeQ?

Because we want to introduce the Dutch market to the unique courses InnovaTeQ provides in IT.
From observability and performance testing to agile working.

Providing good value at a good price.

A collection of InnovaTeQ courses

Here is a collection of InnovaTeQ courses:

Observability Concept Essentials

Observability Maturity Unlocked

Observability in Action – Roles & Use Cases

Time to get involved in IT education, the InnovaTeQ way!

Illustration: the evolution from IT operator to IT analyst, jumping from one mountain to another

Why Moving in IT from Test Operator to Test Analyst Is the Hardest Step in the TestingSaaS Skill Maturity Framework


In the TestingSaaS Skill Maturity Framework, the jump from Level 2 – Operator to Level 3 – Analyst is where most testers plateau.

Not because they lack intelligence.
Not because they lack tooling skills.

But because this transition is not about learning more tools.

It’s about changing how you think.

source: https://subud.ca/overcome-obstacles/

Level 2 – Operator: Reliable Execution

At Level 2, professionals are strong executors.

They:

  • Write and maintain automated tests
  • Execute regression suites
  • Use tools like Selenium, Playwright, Postman
  • Deliver predictable output

Success is measured in:

  • Number of tests
  • Stability of regression
  • Coverage percentage
  • Passed vs failed results

The Operator works inside the system.

They make it run.

This level is valuable. Many SaaS companies depend on strong Level 2 professionals to keep releases stable.

But it is not yet strategic.

Level 3 – Analyst: Strategic Quality Thinking

At Level 3, something changes.

The Analyst asks different questions:

  • What risks are we actually mitigating?
  • What is the business impact if this fails?
  • Where are our coverage gaps?
  • Which parts of this system are fragile?
  • Should this even be automated?

Instead of executing tests, the Analyst designs quality strategy.

They connect:

  • Requirements → Architecture → Risk → Test Approach
  • Product decisions → Quality trade-offs
  • Business goals → Technical implementation

The Analyst works on the system, not just in it.

Why This Transition Is So Difficult

1. It Requires an Identity Shift

Level 2 value = “I can build and run tests.”

Level 3 value = “I can reason about risk and complexity.”

That shift feels uncomfortable.

Tool mastery gives certainty.
Risk analysis gives ambiguity.

Many professionals hesitate because they feel they are losing their strongest asset: execution speed.

2. You Must Be Comfortable Challenging Decisions

Analysts ask uncomfortable questions:

  • Why are we testing this feature?
  • What happens if we don’t?
  • Is this really high risk?
  • Are we over-automating?

That can feel confrontational, especially in delivery-driven SaaS environments.

It requires confidence and communication skills, not just technical expertise.

3. Tooling Stops Being the Center

At Level 2, tools are your identity.

At Level 3:

  • Architecture matters more than frameworks.
  • Risk matters more than coverage percentage.
  • Impact matters more than script count.

This is psychologically hard because many testers built their careers around automation expertise.

4. You Need System Thinking

Analytical maturity demands abstraction:

  • Understanding dependencies
  • Modeling data flows
  • Seeing edge cases before code exists
  • Translating business language into test strategy
  • Recognizing where failures cascade across SaaS integrations

This is cognitive growth, not procedural growth.

It takes deliberate practice.

5. Organizations Often Reward Level 2 Behavior

Many companies:

  • Say they want strategic QA
  • But measure success in test case output
  • Celebrate automation numbers
  • Prioritize speed over reflection

So professionals stay in the safe zone of execution.

And maturity stalls.

Why This Matters in SaaS Environments

In SaaS companies, especially scaling ones:

  • Releases become more frequent
  • Integrations multiply
  • Customer impact increases
  • Architectural complexity grows

Level 2 professionals keep things running.

Level 3 professionals prevent future chaos.

Without Analysts:

  • Automation becomes noise
  • Regression grows without strategy
  • Technical debt accelerates
  • Quality becomes reactive instead of proactive

This is exactly where many Salesforce partners and SaaS scale-ups struggle.

How to Move from Operator to Analyst

The shift is intentional. It does not happen automatically.

Practical steps:

  1. Start mapping risk before writing tests.
  2. Ask “What could hurt the business?” in every refinement.
  3. Study architecture diagrams.
  4. Model data flows.
  5. Participate in product discussions.
  6. Stop measuring your value in test count.

Replace:

“How do I automate this?”

With:

“Should this be automated and why?”

The Strategic Tipping Point

In the TestingSaaS Skill Maturity Framework, Level 3 is the tipping point where:

  • Quality becomes strategic
  • Testers influence decisions
  • Automation becomes intentional
  • QA starts shaping architecture discussions

It’s the difference between being a reliable executor and becoming a quality architect.

And that is why the jump feels difficult.

It requires you to grow beyond the comfort of tools into the responsibility of judgment.

If you are currently operating at Level 2, ask yourself:

Are you maintaining stability?

Or are you shaping the future risk profile of your product?

That answer defines your maturity.