An inspector arrives. They ask a Year 5 teacher, a Year 9 teacher, and a department head about technology in their lessons. The answers that land are descriptions of what students are doing with the screens, with one or two specific examples. The answers that struggle are app lists, adoption percentages, and screen-time figures. The technology question is a learning question, and inspectors are listening for a learning answer.

PICRAT is the simplest way I know to put the first kind of answer in front of you when you need it. It does not require a special CPD week, a new policy, or a procurement decision. It needs about an hour of departmental time to land and a half-term of audit to settle.

What Ofsted actually asks about technology

The deep dive is a focused inspection method that traces a subject or theme through the three quality-of-education judgements. For technology, that means three threads.

Three questions in a technology deep dive
Intent: Why is the technology in this curriculum, and what would the curriculum be missing without it?
Implementation: How does the technology appear in the classroom day to day? Teacher-led, student-led, occasional, daily?
Impact: What is the difference to the learning? What would a child have learned with paper that they would or would not learn with the iPad?
The three threads inspectors trace, restated as PICRAT-shaped questions.

Each of those three is a learning question with a tool inside it. The technology is incidental to all three. The answer that holds up is one that describes learning. The answer that struggles is one that describes the technology.

How PICRAT answers each thread

PICRAT was built for these three questions, although it pre-dates the deep-dive language by a few years.

Intent maps to the column. The decision to choose Replace, Amplify or Transform is the school's intent in miniature. Technology that does what paper would have done is a deliberate choice for some lessons (a recap, a quick check) and a missed opportunity for others. A school whose intent is sharp will say so. The choice of Amplify over Replace for a Year 4 reasoning lesson is the intent showing its working.

Implementation maps to the row. What students are doing with the technology is the visible part of implementation. A school that says "students are creating, marking up and reasoning with the technology" can defend the description with examples. A school that says "students are using the apps the curriculum mandates" has answered a different question.

Impact maps to the cell. The cell tells you what the technology is adding beyond paper. A Passive Replace lesson has near-zero impact attributable to the technology, because the same lesson on paper would teach roughly the same thing. An Interactive Amplify lesson has clear impact: students could not have done this work, in this depth, on paper, in this time. A Creative Transform lesson has the highest impact when it earns its place: a published artefact, a peer-reviewed argument, a primary-source comparison the lesson would not have managed otherwise.

Three judgements. One grid. The same description works for all three.

A deep-dive paragraph that holds up

Here is the kind of paragraph a Head can read into a deep-dive conversation, on the back of a PICRAT audit.

In Year 5 maths, students use a number-line tool to manipulate fractions while reasoning aloud with a partner. We placed that lesson at Interactive Amplify: the tool extends what students could do with paper manipulatives, and it forces them to articulate the move they are making. In Year 8 history, students record a 60-second piece-to-camera arguing whether William's consolidation strategy was just, citing evidence on screen. We placed that lesson at Creative Transform: the recorded argument with citations is a learning artefact the lesson could not produce on paper. Across the school, we audit lessons that use technology against this grid each half-term. Most lessons sit at Interactive Amplify, which is the cell where we see the most consistent gain over a paper-only equivalent. Where lessons sit at Passive Replace we ask whether the placement was a deliberate choice (often for a recap or a recall lesson) or a drift we want to nudge.

That paragraph names two specific lessons, classifies them, defends the placement, and shows the school's audit habit. It is the deep-dive answer that survives.

What gets you in trouble

Three answers reliably struggle in a deep-dive conversation about technology.

The first is an apps list. "We use Google Workspace, iPads in Years 3 to 6, MS Office in Years 7 to 13, and Kahoot for retrieval practice." That answers a procurement question. The deep dive is asking a learning question.

The second is a screen-time figure. "Students are on iPads for an average of 90 minutes a day across the curriculum." Same problem. The number is precise. It does not describe learning.

The third is an adoption rate. "94% of teachers have completed the Apple Education Foundations training." That answers a CPD-budget question. The inspector will ask, politely, what difference the training has made to lessons. If the answer to that question is also a number, the conversation will keep moving away from the lesson.

The fix is the same in all three cases. Lead with one specific lesson. Classify it on PICRAT. Defend the placement. Show the audit habit that produced the placement. The conversation moves back to learning, which is where the inspector wants it.

What to do tomorrow

Three steps that will land before half-term.

First, audit the technology-using lessons in one year group on the PICRAT grid. Twenty lessons is enough. The dots will cluster, and the cluster is the conversation. Do one year group, well, before scaling.

Second, write the deep-dive paragraph for that year group. Two specific lessons, classified, defended, with the audit habit named. Keep it under 200 words. Print it. Put it on the inside of a folder that lives in the office.

Third, repeat with one more year group every half-term. By the end of the school year, every year group has its paragraph, and the school has a unified language for learning with technology that an inspector can hear in the same sentence the teachers are using.

If you want the audit done for you on a specific lesson, drop the plan into Review. It will classify the lesson on PICRAT, surface the strengths, flag the weaknesses, and suggest the smallest move that would change its placement. Use the output as the spine of the paragraph.

Talk about learning in the language of activity

The schools that handle the technology question well in a deep dive are the ones that answer in the same language they use to teach. The framework gives you that language. Ofsted is asking you to describe learning. PICRAT is one way to describe it that holds up under cross-examination.

Andy Perryer is the head of digital learning at a group of international schools and the creator of PICRAT Suite. The PICRAT framework was developed by Royce Kimmons, Charles Graham and Richard West in their 2020 paper in the CITE Journal.