What Is Training Needs Assessment (TNA)? The Complete 2026 Guide

Here’s a scenario every L&D professional recognizes. The email arrives on a Tuesday: “We need soft skills training for the team.” No performance data attached. No business outcome specified. A manager has already decided the solution before anyone has diagnosed the problem.
If you’ve worked in L&D for more than six months, you’ve received that email. And if you simply delivered what was asked, designed the course, booked the room, marked ‘complete’ in the LMS, you also know what happens next: the performance problem doesn’t move.
That’s the world that makes training needs assessment essential. Not as a bureaucratic checkbox, but as the thing that separates learning that actually changes behavior from learning that fills calendars.

What Is Training Needs Assessment?

A training needs assessment (TNA) is a systematic process organizations use to identify the gap between an employee’s current skills, knowledge, or behavior and what is required to achieve specific business outcomes, before any training is designed or delivered.

According to the Association for Talent Development (ATD), needs assessment is “the process of gathering data to determine what individual performers need for the organization to accomplish its stated goals.” TNA answers one question first: is training actually the right solution, and if so, what kind, for whom, and why?

In HRM, TNA sits at the front of the learning and development lifecycle. It is the diagnostic phase, equivalent to a doctor ordering tests before prescribing treatment. Skip it, and you risk treating the wrong disease.

What Is Training Needs Assessment in HRM?

In human resource management, a training needs assessment is the entry point to the entire learning and development process. Before a slide deck gets built, a facilitator gets booked, or a module goes live, TNA answers: do we actually have a training problem?

That question is deceptively hard. Because sometimes what looks like a skill gap is a motivation problem. Sometimes it’s a process failure, a tool problem, a resourcing constraint, or a management behavior that training will never fix. A well-run TNA distinguishes between these, and saves organizations from spending serious budget solving the wrong problem.

In HRM, a training needs assessment means more than identifying topics. Its purpose is to connect learning investment to measurable business outcomes, to link organizational strategy to job-level requirements to individual capability gaps in one coherent, evidence-based picture.

As Training Industry puts it: “most mistakes in aligning learning investments come from being imprecise in defining business needs and shifting into delivery mode too early.” TNA forces the pause before that delivery mode kicks in. In doing so, it prevents the most common failure mode in corporate L&D: training that happens, and changes nothing.

Why TNA Matters More Than Ever: The Data Says So

The numbers are stark. Only 11% of employees fully apply new training back on the job (24×7 Learning Research). Only 15% of business leaders believe learning is a core part of company strategy (Docebo 2024). And despite record per-employee spend, around $1,400 per head in large organizations, outcomes remain weak.

The culprit, more often than not, is not poor delivery. It’s poor diagnosis.

Here’s what makes training needs assessment so high-stakes: when L&D teams push programs that don’t connect to real performance problems, they don’t just waste budget. They erode credibility. Managers stop releasing teams for training. Business leaders start questioning L&D’s value. Budgets shrink, which reduces capacity for next year, accelerating the decline.

A rigorous TNA breaks that cycle. It gives L&D the data to speak leadership’s language: here is the gap, here is its business cost, here is how training closes it, and here is how we’ll measure success.

Training needs assessment benefits:

  • Prioritizes learning investment toward gaps with the highest business impact
  • Prevents ‘training theater’: programs that look productive but change nothing
  • Identifies non-training root causes: process failures, tool gaps, motivation issues, management behaviors
  • Personalizes learning paths; 59% of millennials consider training offerings when evaluating an employer (SkillsCaravan, 2024)
  • Gives managers a role in defining the problem, dramatically improving learner buy-in and adoption
  • Signals to employees that their development is taken seriously, not just administered

“Training needs analysis is critical if you want to ensure you don’t waste resources, time, and energy. When done correctly, people learn more quickly, there is a greater impact on job performance, and it reduces the frustration that comes with taking on new roles.”
— Emily Chipman, Executive Coach, Rushman Consulting Solutions (via AIHR)

The 3 Levels of Training Needs Assessment

Most solid training needs assessment frameworks are built on the three-level model first proposed by McGehee and Thayer in 1961, and still the backbone of every serious TNA today. OPM, AIHR, CIPD, and ATD all anchor their guidance on it. Here’s how it works in practice:

Level 1: Organizational Training Needs Assessment

The macro view. An organizational training needs assessment asks: where is the business going, and what capabilities does it need to get there? This level examines company strategy, technology adoption roadmaps, regulatory changes, market shifts, workforce projections, and succession gaps.

Key inputs: strategic plans, leadership interviews, HR analytics dashboards, performance scorecards, post-merger integration documentation.

This level answers: what should L&D even be prioritizing this year? Without it, everything at Levels 2 and 3 risks being disconnected from what the business actually needs.

Level 2: Task / Job-Level Assessment

Now you zoom into specific roles. What does excellent performance in this job look like, not on the job description, but in observed reality? What knowledge, skills, and behaviors (KSBs) does it require? And what are the observable performance standards?

Key inputs: job task analysis, SME and high-performer interviews, work observation, performance standards, role profiling data.

This level matters because it creates the target state, the definition of ‘good’, against which you’ll measure individual gaps at Level 3. Skip this and your gap analysis has no reliable standard to measure against.

Level 3: Individual Training Needs Assessment

The most granular level. An employee training needs assessment at this level examines specific people, drawing on performance reviews, 360-degree feedback, competency self-assessments, and direct manager observations to identify where each individual falls short of the Level 2 standard.

This level matters because two people with the same job title can have entirely different development needs. Treating them identically wastes time for one and leaves critical gaps in the other.

| Level | Name | Core Question | Key Data Sources | Triggered When… |
|---|---|---|---|---|
| L1 | Organizational | Where is the business going, and what capabilities does it need? | Strategic plans, leadership interviews, HR dashboards, workforce projections | New market entry, ERP rollout, regulatory change, merger, digital transformation |
| L2 | Task / Job | What does excellent performance in this role actually look like? | Job task analysis, SME interviews, performance standards, role profiling | High error rates in a function, new role creation, promotion pipeline gaps |
| L3 | Individual | Where does each specific person fall short of that standard? | Performance reviews, 360 feedback, competency assessments, manager observations | Onboarding, PIP situations, high-potential development, post-promotion support |

⚠️ The Most Expensive TNA Mistake

Organizations routinely jump straight to Level 3, sending out employee surveys or launching competency assessments, without anchoring those findings to Levels 1 and 2 first.

The result: training that genuinely improves individual skills that no one at the business level needed improved. Every step was executed correctly. The strategic ROI was zero.

Always work top-down: Organizational → Job Level → Individual. This sequence is not optional; it’s the logic that makes everything downstream relevant.

Types of Training Needs Assessment

There is no single training needs assessment model that fits every context. The type you choose depends on what triggered the need in the first place. Here are the types practitioners encounter most frequently:

Performance-Based TNA

Triggered by declining KPIs, rising error rates, quality failures, customer complaint spikes, or post-audit gaps. You’re diagnosing why performance dropped, and determining whether training is the right fix, or whether the root cause is upstream of learning entirely.

Compliance-Driven TNA

Triggered by regulatory requirements, policy updates, safety standards, or legal changes. Often non-negotiable in scope and timing. In healthcare, financial services, and manufacturing, compliance TNA is a recurring, structured process with legal consequences if not completed properly.

Proactive / Strategic TNA

Triggered by strategic planning: a new product launch, technology deployment, market expansion, or workforce transformation initiative. You’re identifying capability gaps before they become performance problems. This is where the best-resourced L&D teams spend most of their time, and what separates strategic L&D functions from reactive ones.

Change-Readiness TNA

Triggered by transformation: mergers, restructuring, digital overhaul, AI adoption across functions. You’re assessing whether your workforce has the capability, and mindset, to navigate significant organizational change. In 2026, this type increasingly overlaps with change management and organizational development.

New Hire / Onboarding TNA

Specific to the onboarding context: diagnosing what new employees need to reach productivity quickly, and identifying where the current onboarding experience has gaps. Chronically underinvested, despite the documented impact of effective onboarding on 90-day retention and first-year performance.

Training Suitability Analysis: The Step Most Organizations Skip

Before any TNA produces a training recommendation, experienced practitioners run a training suitability analysis, a formal check on whether training is the right answer at all. Not all business problems are solved by learning; not all behaviors are changed through training. Documenting why training is appropriate for this specific gap creates accountability, and protects L&D from being set up to fail.

The Training Needs Assessment Process: 7 Steps

The training needs assessment process varies across published models, but experienced practitioners converge on a consistent structure. Here’s a field-tested seven-step framework, built from ATD, AIHR, CIPD, CDC Quality Training Standards, and real practitioner guidance:

Step 1: Define the Business Outcome First

Don’t start with ‘what training do we need?’ Start with: what specific, measurable business outcome are we trying to achieve? Revenue targets, error rate reduction, attrition improvement, compliance rates, time-to-productivity? Anchoring the TNA to a concrete business goal keeps every decision downstream relevant, and gives you a success benchmark to measure against.

Step 2: Identify and Quantify the Performance Gap

Where is current performance? Where does it need to be? The gap between those two points, specific and measurable, is your working hypothesis. ‘The team needs better communication’ is not a gap. ‘First-call resolution is at 61% against a 78% target for the past two quarters’ is a gap. Vague gaps produce generic training that changes nothing.
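To make the distinction concrete, here is a minimal sketch of what “specific and measurable” looks like in practice. The metric values mirror the first-call-resolution example above; the cost-per-point figure is a hypothetical assumption for illustration, not data from the source.

```python
# Sketch: express a performance gap as a number, not an adjective.
# The cost_per_point value is purely illustrative.

def quantify_gap(current, target, cost_per_point=0.0):
    """Return the gap in metric points and its estimated annual cost."""
    gap = target - current
    return gap, gap * cost_per_point

# First-call resolution: 61% actual vs 78% target, assuming each
# missing point costs ~$12,000/year in repeat contacts (hypothetical).
gap_points, annual_cost = quantify_gap(61, 78, cost_per_point=12_000)
# gap_points == 17
```

A 17-point gap with an attached dollar figure is a working hypothesis leadership can react to; “better communication” is not.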

Step 3: Investigate Root Causes (The Critical Step)

This is the step that separates rigorous TNA from box-ticking TNA: asking why the gap exists before deciding what to do about it.
Is it a knowledge gap (they don’t know how)? A skills gap (they know how but can’t execute fluently)? A motivation problem (they can do it but choose not to)? A broken process (the system makes it hard)? A resource constraint? CDC’s Quality Training Standards framework identifies root cause investigation as the foundational step in any needs assessment, because the root cause determines whether training is appropriate at all.

Step 4: Run the Training Suitability Check

Formally ask, and document, whether training will close this specific gap. If the root cause is motivation, training won’t fix it. If it’s a broken workflow, training won’t fix it. And if it’s a management behavior, no L&D program in the world will fix it. This check creates organizational clarity about the real solution, and protects L&D from being blamed for failing to solve problems that were never training problems.

Step 5: Collect Data Using the Right Methods

Now you gather evidence using your chosen TNA methods: surveys, interviews, observations, performance data, competency assessments, LMS analytics, focus groups. The right combination depends on timeline, budget, and population. Quality of evidence matters far more than quantity of methods. (See the methods section below for detailed method-by-method guidance with practitioner tips.)

Step 6: Analyze, Prioritize, and Name Non-Training Needs

Not all gaps are equally urgent or equally addressable through training. Rank findings by business impact and trainability. Explicitly identify what training won’t solve, this is where L&D earns lasting credibility rather than perpetually delivering requests.
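The ranking logic in this step can be sketched as a simple impact-by-trainability score. Everything here is illustrative: the gap names, the 1–5 scores, and the trainability floor are assumptions, not a prescribed scoring model.

```python
# Hypothetical TNA prioritization sketch: rank identified gaps by
# business impact x trainability, and explicitly flag gaps that
# training cannot address. All names and scores are illustrative.

GAPS = [
    # (gap, business_impact 1-5, trainability 1-5)
    ("First-call resolution below target", 5, 4),
    ("CRM data-entry errors", 3, 5),
    ("Low morale after restructuring", 4, 1),  # likely not a training problem
    ("New compliance policy unknown to staff", 4, 5),
]

def prioritize(gaps, trainability_floor=2):
    """Score trainable gaps as impact * trainability; route the rest
    to a named non-training list (process, incentives, management)."""
    training, non_training = [], []
    for name, impact, trainability in gaps:
        if trainability < trainability_floor:
            non_training.append(name)
        else:
            training.append((impact * trainability, name))
    training.sort(reverse=True)  # highest score first
    return training, non_training

ranked, flagged = prioritize(GAPS)
```

The point of the `non_training` list is Step 6’s credibility move: the deliverable names what training won’t solve instead of silently dropping it.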

Step 7: Report with Recommendations, Not Just Findings

A TNA ends with a concrete deliverable: not just a list of gaps, but a prioritized set of recommendations with proposed interventions, target populations, delivery formats, timelines, and measurable success criteria. As ATD frames it: the goal is to ‘present findings to gain acceptance, build credibility, and collaborate with stakeholders.’ The report turns diagnosis into action.

Methods & Tools, and What Guides Won’t Tell You About Each

The training needs assessment methods you choose determine the quality of your insights. Every method has strengths and failure modes. Here’s an honest breakdown with the practitioner reality that most published guides quietly skip:

| Method | Best For | Effort | Practitioner Reality |
|---|---|---|---|
| Employee Surveys | Large or distributed teams | Low | Use anonymous formats; employees won’t admit gaps they find embarrassing on named forms |
| Structured Interviews | Senior roles, SMEs, nuanced functions | High | Interview top performers first to define what ‘excellent’ looks like before mapping gaps |
| Focus Groups | Team-level patterns and culture signals | Medium | Bring similar roles together; cross-functional groups muddy the signal |
| Observation / Shadowing | Operational, frontline, technical roles | High | Most underused, highest signal. What people say they do and what they do rarely match |
| Performance Data Review | Any role with measurable KPIs | Medium | Start here before anything else; your ops data already tells the story |
| 360-Degree Feedback | Behavioral and leadership gaps | Medium | Most useful at individual level; watch for peer rating bias in team-tight cultures |
| Competency Assessments | Technical and compliance-heavy roles | Low–Med | Only works if your competency framework is current; most aren’t |
| LMS / LXP Analytics | Scalable, ongoing monitoring at scale | Low | Completion rate ≠ capability. Re-attempt rates and quiz scores tell the real story |

💡 The Anonymous Survey Advantage (Rarely Mentioned)

One insight from experienced practitioners that almost never appears in published TNA guides: anonymous survey formats return significantly more honest data than named ones.

As one L&D consultant noted: “Admitting that we don’t have the right level of skill necessary for our jobs can feel quite risky and vulnerable.” Employees in named surveys tend to overstate competence or report what they think managers want to hear.

For an accurate picture of where skill gaps actually exist, especially in performance-sensitive or compliance-critical roles, anonymous surveys give you the truth that named surveys suppress. If your TNA requires named assessments (360 reviews, competency ratings), consider running a brief anonymous survey first to capture the unguarded signal.

Who Conducts a TNA, and When They Should Be Brought In

The formal answer: L&D professionals, HR business partners, organizational development consultants, or some combination. In large enterprises, TNA is a dedicated function with its own methodology. In smaller organizations, it’s often one HR generalist doing everything.

But the more important question is when L&D gets brought in, and this is where most organizations get it consistently wrong.

L&D teams typically receive training requests after the decision has already been made. A manager has identified a problem, decided training is the answer, and is now looking for a vendor or a course. By the time L&D is in the room, the diagnosis is already locked in. Running a TNA at that stage feels like friction, and practitioner communities consistently report stakeholder resistance to needs assessment as a real, recurring professional challenge.

This is significant enough that ATD’s certificate program on needs assessment includes a dedicated module on addressing stakeholder resistance to TNA, which tells you everything about how common and consequential this dynamic is. Stakeholders want solutions. They want them fast. Getting them to pause on diagnosis requires credibility, political capital, and the ability to reframe TNA not as delay but as risk reduction.

What works: being present at the business problem stage, in leadership meetings, project kickoffs, quarterly reviews, rather than waiting to be invited after the solution has already been packaged as a training request. Presence upstream is the highest-leverage move in any L&D professional’s practice.

How Long Does a TNA Take?

This is one of the most common questions practitioners get asked, and one that almost no published guide actually answers. Based on practitioner benchmarks and industry data, a typical TNA runs between 1 and 8 weeks, depending on scope:

| | Small / Targeted | Mid-Size Org | Large Enterprise | Enterprise + Multi-Function |
|---|---|---|---|---|
| Timeline | 1–2 weeks | 2–3 weeks | 3–6 weeks | 6–8 weeks |
| Approach | Single role or team. 1–2 methods (survey + interviews). | Cross-functional. 2–3 methods. Manager and employee data. | Multiple BUs or regions. Mixed-method. Stakeholder workshops. | Full workforce TNA. Six-plus data sources. Board-level sign-off. |

These timelines assume cooperative stakeholders. In practice, the biggest delays are: stakeholder availability for interviews, data access from operations teams who don’t routinely share metrics with HR, and organizational ambiguity about who actually owns the performance problem. Budget extra time for all three.

How often should TNA be repeated? At the organizational level, formally annually, aligned to the business planning cycle. At team and individual levels, more continuously, woven into performance review cycles, quarterly manager check-ins, and event-triggered reviews (system rollouts, compliance updates, significant attrition spikes, NPS drops).

In 2026, with AI-powered skills intelligence tools gaining adoption, more organizations are moving toward always-on skills gap monitoring, using LMS and HRIS data to maintain a continuously updated view of capability gaps, supplemented by formal TNA when specific business initiatives require it.

TNA vs. Training Needs Analysis vs. Learning Needs Assessment

Training Needs Assessment vs. Training Needs Analysis

Most practitioners use these terms interchangeably. ATD makes a formal distinction: needs assessment is the data-gathering phase (identifying what gaps exist and whether training is appropriate), while needs analysis is the interpreting phase (synthesizing that data to design specific learning objectives and solutions). Assessment comes first; analysis follows.

In practice, almost no organization separates these into distinct projects. They run as one continuous workflow. But understanding the distinction helps L&D frame their work more precisely when explaining to stakeholders why the process takes 2–6 weeks rather than 2 days. You can use the Training Needs Analysis Template to identify training gaps in your organization.

Training Needs Assessment vs. Learning Needs Assessment

This is the more substantive distinction. According to the CIPD, a Learning Needs Assessment (LNA) is broader than a TNA. It’s an ongoing organizational health check on skills, talent, and capabilities, gathered systematically across multiple stakeholders, continuously, rather than in response to a specific performance problem.

A TNA is typically project-specific and time-bounded: triggered by a particular business need, completed in weeks, resulting in a specific training recommendation. An LNA is the continuous intelligence system that makes future TNAs faster and more accurate by maintaining current data on organizational capability.

Modern L&D functions that have the infrastructure, and often the LMS and HRIS data, are building toward LNA. Most organizations still operate at TNA level. That’s a completely valid starting point.

💡 Real-World Failure: What Happens When TNA Comes Too Late

In a documented case shared by facilitator Deborah Rim Moiso (SessionLab), a large public nature conservation project commissioned multi-stakeholder participatory workshops, without assessing the actual needs of the locals involved first.

Months into design and facilitation planning, the team discovered the real situation: the farmers and cattle-breeders involved simply needed updated technical knowledge from Subject Matter Experts, not collaborative problem-solving sessions at all.

The solution design had to be rebuilt from scratch. Budget, goodwill, and weeks of design work were lost.

The lesson holds in every industry: no matter how well-conceived a learning intervention looks on paper, TNA must come before design, not during it.

The Practitioner’s Reality: What Nobody Teaches You About TNA

Every L&D certification, every competency framework, every article on this topic tells you how TNA should work in theory. Here’s what practitioners in L&D communities, across platforms, forums, and peer conversations, actually report on the ground:

You’ll often arrive after the solution has already been decided

The L&D Academy’s Dani Bacon describes it perfectly: practitioners receive requests like “I’ve set 2026 goals for my team, and now I need to train them in negotiation and customer experience, can you help?” The manager has already identified a symptom and jumped to a solution, without involving L&D in the diagnosis. This isn’t a failure. It’s the default. Building the skill to diplomatically re-open the diagnosis, without being seen as obstructive, is one of the most important and least discussed competencies in the L&D profession.

Stakeholder resistance to TNA is real and common, enough for ATD to teach it

Business stakeholders want speed. TNA takes time. ‘We already know what the problem is, just build the training’ is an extremely common response. Having a practiced, evidence-based counter to this, one that frames skipping TNA as a financial risk, not a process preference, is essential. ATD dedicates curriculum time specifically to navigating this resistance. The fact that a major professional body felt it necessary to formalize this module should tell you how widespread the challenge is.

Your best data is probably already sitting in your building

Experienced practitioners consistently report that organizations have far more relevant performance data than they realize. CRM win/loss logs, quality audit records, NPS scores by team, exit interview themes, onboarding survey results, LMS re-attempt rates, call recording analysis: all of these are TNA data that already exists. The bottleneck is usually access: L&D doesn’t have relationships with the operations and analytics teams who hold this data. Building those relationships is a higher-leverage investment than any survey template.

Recommending ‘no training needed’ is a win, not a failure

Many practitioners report subtle internal pressure to always recommend training, because recommending against it can feel like arguing against the team’s own relevance. This framing is dangerous. L&D teams that can confidently conclude ‘this is not a training problem, the fix belongs in process redesign, management behavior, or incentive structure’ are the ones that earn lasting strategic credibility and are invited into business decisions earlier. Teams that can only recommend training eventually become perceived as order-takers rather than advisors.

Best Training Needs Analysis Practices for 2026

If you’re running a training needs assessment this year, here’s what separates the ones that drive real performance change from the ones that produce a report nobody acts on:

Anchor everything to a measurable business outcome

‘We need a leadership program’ is not a TNA brief. ‘We’re losing 34% of first-line managers within 18 months, exit interviews point to isolation and lack of coaching, and this costs approximately $420K annually in replacement costs’ is. Start with the number. Build the TNA to answer whether and how training can move it.

Build cross-functional data access before you need it

The best TNAs combine L&D expertise with hard operations data. Build relationships with analytics teams, operations managers, and finance well before any specific TNA begins. Without performance data, your TNA is an opinion poll with a methodology. With it, it’s a business case with evidence.

Use anonymous survey formats for honest gap data

Admitting skill deficiencies feels risky for employees, particularly in competitive or performance-evaluated roles. Anonymous surveys remove social risk and produce significantly more accurate needs data. If your process requires named competency ratings later, run an anonymous pre-survey first to get the unguarded picture.

Mine your LMS and LXP data before sending a single survey

If your organization runs a learning platform, you already have a skills intelligence dataset. Re-attempt rates, assessment score distributions, module abandonment curves, pathway drop-off points, all of these signal where real gaps exist, at scale, without any additional data collection. Most L&D teams treat their LMS as a content delivery mechanism. Start treating it as your first TNA data source.
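As a rough sketch of what that first pass might look like, the snippet below flags modules where learners need many attempts and still score low, a stronger gap signal than completion rate alone. The record shape (`learner`, `module`, `attempts`, `score`) and the thresholds are assumptions; real LMS/LXP exports will differ by vendor.

```python
# Hypothetical sketch: surface skill-gap signals from an LMS activity
# export. Field names and thresholds are assumed for illustration.
from collections import defaultdict

records = [
    {"learner": "a01", "module": "Data Privacy", "attempts": 1, "score": 92},
    {"learner": "a02", "module": "Data Privacy", "attempts": 3, "score": 71},
    {"learner": "a03", "module": "Objection Handling", "attempts": 4, "score": 65},
    {"learner": "a04", "module": "Objection Handling", "attempts": 2, "score": 68},
]

def gap_signals(rows, attempt_threshold=2.0, score_threshold=75):
    """Flag modules where mean attempts are high AND mean scores are low:
    learners are trying repeatedly and still not reaching competence."""
    stats = defaultdict(lambda: {"attempts": [], "scores": []})
    for r in rows:
        stats[r["module"]]["attempts"].append(r["attempts"])
        stats[r["module"]]["scores"].append(r["score"])
    flagged = []
    for module, s in stats.items():
        mean_attempts = sum(s["attempts"]) / len(s["attempts"])
        mean_score = sum(s["scores"]) / len(s["scores"])
        if mean_attempts >= attempt_threshold and mean_score < score_threshold:
            flagged.append(module)
    return flagged
```

Here “Data Privacy” draws re-attempts but scores recover, while “Objection Handling” shows both signals, exactly the kind of module worth carrying into a formal TNA.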

Interview top performers before assessing everyone else

You cannot measure a gap without a standard to measure against. Most TNA processes go straight to surveying average employees, which tells you where people feel uncertain, not where excellent performance actually requires capability. Interviewing your best two or three performers per role first gives you a precise target state. That makes every gap measurement that follows far more accurate and defensible.

Build AI-driven skill shifts into every TNA scope in 2026

Any training needs assessment that doesn’t account for how AI is reshaping job functions in your industry is incomplete before it’s finished. AI literacy, prompt design, human-AI collaboration, judgment in AI-assisted decision-making, and critical evaluation of AI outputs are emerging requirements across nearly every professional role. The LinkedIn Workplace Learning Report 2024 confirms that 90% of L&D professionals see proactive skill-building as essential to navigating the future of work. If your TNA doesn’t surface these specifically, revisit the scope.

Deliver recommendations, not just findings

A TNA report that lists gaps without prioritizing them or proposing next steps is not a complete deliverable. Stakeholders need to see: here is what we recommend, here is why, here is who needs it, here is the proposed format and timeline, and here is how we’ll know it worked. Without this, even the most rigorous diagnostic effort produces no action, and no credit.

FAQ: Training Needs Assessment

Q1. What is training needs assessment in simple terms?

A training needs assessment is the process of determining what training employees genuinely need, who needs it, and why, before any program is designed or delivered. It identifies the gap between current capabilities and what the business requires, determines whether training is the right way to close that gap, and if it isn’t, points toward what the actual solution is. It’s the diagnostic before the prescription.

Q2. What are the 3 levels of training needs assessment?

The three levels are: (1) Organizational, aligning training with company strategy, technology, and workforce planning; (2) Task/Job, defining what excellent performance in a specific role requires; (3) Individual, assessing the gaps specific employees have against that standard. All three should inform a complete TNA, always working top-down. The most expensive mistake is jumping straight to Level 3 without Levels 1 and 2 to anchor it.

Q3. What is the purpose of training needs assessment in HRM?

The primary purpose is to ensure training investment is directed at the right problems, for the right people, for the right reasons, based on evidence, not assumption or urgency. ATD defines it as gathering data to determine what performers need for the organization to achieve its goals. Equally important: TNA determines when training is not the right solution, protecting organizations from investing in programs that can’t move the metrics they care about.

Q4. What methods are used in training needs assessment?

Common methods include employee surveys (anonymous formats recommended), structured manager and SME interviews, focus groups, performance data review, job observation and shadowing, competency assessments, 360-degree feedback, and LMS/LXP analytics. Most effective TNAs combine two to three methods. Performance data review should always come first, your existing data will tell you more than any survey, faster.

Q5. How is TNA different from training needs analysis?

In most real-world L&D practice, the terms are interchangeable. ATD formally distinguishes them: assessment is the data-gathering phase (identifying what gaps exist and whether training is appropriate), while analysis is the interpreting phase (synthesizing data into specific learning objectives and solutions). Both happen as part of the same pre-design workflow, they’re phases in a process, not separate projects.

Q6. Who should conduct a training needs assessment?

L&D professionals, HR business partners, or OD consultants typically lead TNA. In smaller organizations, one HR generalist may handle the entire process. What matters more than who does it is when they’re brought in, L&D produces the most accurate, actionable TNA when involved at the business problem stage, not after a training solution has already been packaged by a manager.

Q7. How long does a training needs assessment take?

Typically 1–8 weeks depending on scope. A single-role or single-team TNA can be completed in 1–2 weeks. A full organizational TNA spanning multiple functions or business units can take 6–8 weeks. The biggest time variables are stakeholder availability for interviews, access to operations data, and organizational clarity on who owns the performance problem. Build buffer for all three.

Q8. How often should a training needs assessment be conducted?

At the organizational level, formally annually, aligned to the business planning cycle. At team and individual levels, more continuously, triggered by performance reviews, quarterly check-ins, system rollouts, compliance changes, or significant performance drops. In 2026, leading organizations are moving toward always-on capability monitoring using LMS analytics and HRIS data, supplemented by formal TNA when specific initiatives require deeper investigation.


Written by James Smith

James is a veteran technical contributor at LMSpedia with a focus on LMS infrastructure and interoperability. He Specializes in breaking down the mechanics of SCORM, xAPI, and LTI. With a background in systems administration, James