
---
name: linkedin:import
description: Import a LinkedIn analytics CSV export into the structured analytics system. Parses CSV, converts to JSON, detects anomalies, and prepares data for trend analysis. Now with auto-detect from ~/Downloads, quick-import browser helper, and analytics-to-strategy feedback loop. Use when the user wants to import analytics data from LinkedIn. Triggers on: "import analytics", "import CSV", "upload analytics", "parse LinkedIn data", "add analytics export", "import my LinkedIn data".
allowed-tools: Bash, Read, Glob, Write, AskUserQuestion
---

# LinkedIn Analytics Import Workflow

You are a LinkedIn analytics data import assistant. Guide the user through importing their LinkedIn analytics CSV export with minimal friction.

## Reference

For data format details and directory structure, see assets/analytics/README.md.

## Step 1: Check for CSV Files in Exports Directory

First, check if any CSV files exist in the exports directory:

```bash
ls -lh ${CLAUDE_PLUGIN_ROOT}/assets/analytics/exports/*.csv 2>/dev/null || echo "No CSV files found"
```

If files found: Skip to Step 3.

## Step 1b: Auto-Detect from ~/Downloads

If no files in exports directory, scan ~/Downloads/ for recent LinkedIn CSV files:

```bash
# CSVs from the last 14 days, newest first
find ~/Downloads -maxdepth 1 -type f -name "*.csv" -mtime -14 -exec ls -t {} + 2>/dev/null | head -10
```

Filter the results for LinkedIn-looking files (filenames containing 'linkedin', 'analytics', 'content', or 'export', plus any CSV modified in the last 24 hours).

If matching files found, present them using AskUserQuestion:

Options:

- **Import specific file** — Select one of the detected files
- **Import all** — Import all matching CSV files
- **Quick-import** — Open LinkedIn Analytics in browser and auto-detect download
- **Skip** — Show manual instructions instead

On file selection, copy the file to the exports directory:

```bash
cp "<selected-file>" ${CLAUDE_PLUGIN_ROOT}/assets/analytics/exports/
```

Then continue to Step 4.

## Step 2: If No Files Found Anywhere

If no CSV files exist in exports or ~/Downloads, offer two options:

### Option A: Quick-import (recommended)

Run the quick-import helper that opens LinkedIn Analytics in the browser and watches for the download:

```bash
node ${CLAUDE_PLUGIN_ROOT}/hooks/scripts/quick-import.mjs
```

This will:

  1. Open linkedin.com/analytics/creator/content/ in your browser
  2. Watch ~/Downloads for new CSV files
  3. Auto-copy detected files to the exports directory

After the script completes, continue to Step 4.
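For orientation, the watch-and-copy step can be sketched like this (a hypothetical rendering with illustrative function names; see hooks/scripts/quick-import.mjs for the actual implementation):

```typescript
// Hypothetical sketch of what the quick-import helper does after opening the
// browser: snapshot ~/Downloads, then detect and copy CSVs that appear later.
import * as fs from "node:fs";
import * as path from "node:path";

/** CSV filenames currently in `dir` (empty set if the directory is missing). */
export function listCsvs(dir: string): Set<string> {
  try {
    return new Set(fs.readdirSync(dir).filter((f) => f.toLowerCase().endsWith(".csv")));
  } catch {
    return new Set();
  }
}

/** CSV files present now that were not in the earlier snapshot. */
export function newCsvs(dir: string, before: Set<string>): string[] {
  return [...listCsvs(dir)].filter((f) => !before.has(f));
}

/** Copy a detected download into the plugin's exports directory. */
export function copyToExports(file: string, downloadsDir: string, exportsDir: string): void {
  fs.mkdirSync(exportsDir, { recursive: true });
  fs.copyFileSync(path.join(downloadsDir, file), path.join(exportsDir, file));
}
```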

### Option B: Manual export

1. Go to linkedin.com/analytics/creator/content/
2. Click the "Export" button (top right)
3. LinkedIn will download a CSV file
4. Move it to `${CLAUDE_PLUGIN_ROOT}/assets/analytics/exports/`:

```bash
mv ~/Downloads/linkedin_analytics_export*.csv ${CLAUDE_PLUGIN_ROOT}/assets/analytics/exports/
```

Once done, run /linkedin:import again.

## Step 3: Select Files to Import

If CSV files exist in the exports directory:

  1. List the files with details (name, size, date)
  2. Ask the user which file to import using AskUserQuestion:

Options:

- **Latest** — Import the most recent file only
- **All** — Import all CSV files
- **Select** — Choose a specific file
- **Cancel** — Exit import

## Step 4: Run Import

Once the user selects, run the import CLI:

```bash
ANALYTICS_ROOT="${CLAUDE_PLUGIN_ROOT}/assets/analytics" node --import tsx "${CLAUDE_PLUGIN_ROOT}/scripts/analytics/src/cli.ts" import <filename>
```

If importing multiple files, run the command for each file sequentially.

## Step 5: Capture and Present Results

The CLI will output:

- Number of posts imported
- Date range covered (earliest to latest post)
- Any duplicate posts detected
- Anomalies or alerts detected

Parse the output and present a summary:

```
Import completed successfully!

Summary:
- Posts imported: 42
- Date range: 2025-12-01 to 2026-01-29
- Duplicates skipped: 3
- Anomalies detected: 2 posts with unusually high engagement

Alerts:
- Post "AI agents are eating..." (2026-01-15): 340% above baseline impressions
- Post "The future of no-code..." (2026-01-22): Viral threshold reached (10k+ impressions)

Data saved to:
- ${CLAUDE_PLUGIN_ROOT}/assets/analytics/posts/YYYY-WXX.json
```

## Step 5b: Import Analysis & Anomaly Detection

After successful import, automatically analyze the imported data for anomalies and patterns.

**Anomaly detection:** Compare the imported week's data against existing baselines (if available from previous imports):

1. **Engagement anomalies:**
   - Any post with >3x average impressions -> flag as "breakout post"
   - Any post with <0.5x average engagement rate -> flag as "underperformer"
   - Any post with a comment:reaction ratio above 1:3 -> flag as "conversation starter"
2. **Pattern recognition:**
   - Most successful day of week (by average impressions)
   - Most successful format (if detectable from post content)
   - Posting frequency vs. previous weeks

Read baselines for comparison:

```bash
cat ${CLAUDE_PLUGIN_ROOT}/assets/analytics/baselines.json 2>/dev/null
```

If baselines exist, compare each imported post's metrics against baseline means. If no baselines exist yet, note that this is the first import and baselines will be established.
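The threshold rules above can be sketched as a small pure function (the field names here are assumptions for illustration, not the CLI's actual schema):

```typescript
// Illustrative anomaly flags; thresholds mirror the rules in Step 5b.
type Post = { impressions: number; engagementRate: number; comments: number; reactions: number };
type Baseline = { avgImpressions: number; avgEngagementRate: number };

export function anomalyFlags(post: Post, base: Baseline): string[] {
  const out: string[] = [];
  if (post.impressions > 3 * base.avgImpressions) out.push("breakout post");
  if (post.engagementRate < 0.5 * base.avgEngagementRate) out.push("underperformer");
  // A comment:reaction ratio above 1:3 means more than one comment per three reactions.
  if (post.reactions > 0 && post.comments / post.reactions > 1 / 3) out.push("conversation starter");
  return out;
}
```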

Present as:

```markdown
### Import Analysis — YYYY-WXX

X posts imported (Y new, Z updated)

#### Standout Posts
Breakout: "[hook text...]" — X impressions (3.2x your average)
Conversation Starter: "[hook text...]" — X comments (ratio 1:2.5)

#### Patterns Detected
- Best day: Tuesday (avg 2,100 impressions vs. 1,400 other days)
- Best time: Posts before 8 AM outperformed by 35%
- Format winner: Listicles averaged 40% more engagement

#### Baseline Update
Your rolling 4-week averages have been updated:
- Impressions: X -> Y (change Z%)
- Engagement rate: X% -> Y% (change Z%)
```

If this is the first import (no baselines):

```markdown
### Import Analysis — YYYY-WXX

X posts imported (first import — baselines will be established)

#### Initial Observations
Top post: "[hook text...]" — X impressions
Most discussed: "[hook text...]" — X comments

#### Baselines Established
Your initial baselines are now set:
- Avg impressions per post: X
- Avg engagement rate: X%
- Avg comments per post: X

Import 2-3 more weeks of data for meaningful trend analysis.
```

## Step 6: Analytics-to-Strategy Feedback Loop

After successful import, auto-run a brief analysis to give the user immediate value.

### Step 6a: Content Pillar Performance

Read the user's `expertise_areas` from the state file (`~/.claude/linkedin-thought-leadership.local.md`). Run the trends CLI for impressions and engagement rate:

```bash
ANALYTICS_ROOT="${CLAUDE_PLUGIN_ROOT}/assets/analytics" node --import tsx "${CLAUDE_PLUGIN_ROOT}/scripts/analytics/src/cli.ts" trends --period 4w --metric impressions
ANALYTICS_ROOT="${CLAUDE_PLUGIN_ROOT}/assets/analytics" node --import tsx "${CLAUDE_PLUGIN_ROOT}/scripts/analytics/src/cli.ts" trends --period 4w --metric engagement_rate
```

Cross-reference post topics with expertise_areas. Present a pillar performance table:

```markdown
### Content Pillar Performance (last 4 weeks)

| Pillar            | Posts | Avg Impressions | Avg Engagement | Trend |
|-------------------|-------|-----------------|----------------|-------|
| Azure AI          | 5     | 2,400           | 4.2%           | Up    |
| Copilot Studio    | 3     | 1,800           | 3.1%           | Flat  |
| Power Platform    | 4     | 1,200           | 5.8%           | Up    |
| Semantic Kernel   | 2     | 3,100           | 2.9%           | New   |
| AI Strategy       | 3     | 900             | 2.1%           | Down  |
```

### Step 6b: Post Type Analysis

Categorize imported posts by format (text-only, list, story, question, carousel, poll) based on content patterns. Present format performance:

```markdown
### Format Performance

| Format     | Posts | Avg Impressions | Avg Engagement | Best Hook |
|------------|-------|-----------------|----------------|-----------|
| Lists      | 4     | 2,800           | 5.1%           | "5 things..." |
| Stories    | 3     | 2,200           | 4.5%           | "Last week..." |
| Questions  | 2     | 1,600           | 6.2%           | "What if..." |
| Text-only  | 5     | 1,100           | 2.8%           | — |
```
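The format categorization behind this table can be sketched as a first-line heuristic (the regex patterns are assumptions, not the plugin's actual rules; carousels and polls are not reliably detectable from post text alone, so they fall through to text-only):

```typescript
// Assumed heuristics: numbered openers suggest a list, trailing "?" a question,
// time-anchored openers a story; everything else defaults to text-only.
export function classifyFormat(text: string): string {
  const firstLine = text.split("\n")[0].trim();
  if (/^\d+\s+\w+/.test(firstLine) || /^\d+[.)]\s/m.test(text)) return "list";
  if (/\?\s*$/.test(firstLine) || /^what if/i.test(firstLine)) return "question";
  if (/^(last week|yesterday|a year ago|when i)/i.test(firstLine)) return "story";
  return "text-only";
}
```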

### Step 6c: Optimal Posting Time

Analyze publishing dates vs. performance. Present day-of-week performance:

```markdown
### Day-of-Week Performance

| Day       | Posts | Avg Impressions | Avg Engagement |
|-----------|-------|-----------------|----------------|
| Monday    | 2     | 1,400           | 3.2%           |
| Tuesday   | 4     | 2,600           | 4.8%           |
| Wednesday | 3     | 2,100           | 4.1%           |
| Thursday  | 3     | 2,300           | 3.9%           |
| Friday    | 2     | 1,000           | 2.5%           |
```
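The aggregation behind this table can be sketched as follows (assuming each imported post carries an ISO `YYYY-MM-DD` date string; the type name is illustrative):

```typescript
// Group posts by weekday and average their impressions.
type Posted = { date: string; impressions: number };
const DAYS = ["Sunday", "Monday", "Tuesday", "Wednesday", "Thursday", "Friday", "Saturday"];

export function avgImpressionsByDay(posts: Posted[]): Map<string, number> {
  const sums = new Map<string, { total: number; n: number }>();
  for (const p of posts) {
    // Date-only strings parse as UTC midnight; getUTCDay avoids local-timezone drift.
    const day = DAYS[new Date(p.date).getUTCDay()];
    const s = sums.get(day) ?? { total: 0, n: 0 };
    s.total += p.impressions;
    s.n += 1;
    sums.set(day, s);
  }
  return new Map([...sums].map(([d, s]) => [d, Math.round(s.total / s.n)]));
}
```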

### Step 6d: Actionable Recommendations

Based on the analysis above, generate exactly 3 concrete, data-driven recommendations. Examples:

- "Your list posts average 2.5x the impressions of text-only posts. Consider using list format for your next 2 posts."
- "Tuesday is your strongest day (2,600 avg impressions). Schedule your best content for Tuesdays."
- "Azure AI posts are trending up (+18% impressions). Double down on this pillar next week."

### Step 6e: Update State with Import Date

After successful import and analysis, update the state file:

1. Read `~/.claude/linkedin-thought-leadership.local.md`
2. Set `last_import_date` to today (YYYY-MM-DD)
3. Set `last_import_week` to the current ISO week (YYYY-WXX)
4. Write the updated state file
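The ISO week key is easy to get wrong around year boundaries (the ISO week-year can differ from the calendar year in late December and early January). A sketch of the derivation, using the standard Thursday trick (the plugin's actual implementation may differ):

```typescript
// Derive the YYYY-WXX key: shift to the Thursday of the current week, whose
// calendar year is by definition the ISO week-year, then count weeks from Jan 1.
export function isoWeek(d: Date): string {
  const t = new Date(Date.UTC(d.getUTCFullYear(), d.getUTCMonth(), d.getUTCDate()));
  const dayNum = t.getUTCDay() === 0 ? 7 : t.getUTCDay(); // Monday=1 .. Sunday=7
  t.setUTCDate(t.getUTCDate() + 4 - dayNum); // move to this week's Thursday
  const yearStart = Date.UTC(t.getUTCFullYear(), 0, 1);
  const week = Math.ceil(((t.getTime() - yearStart) / 86400000 + 1) / 7);
  return `${t.getUTCFullYear()}-W${String(week).padStart(2, "0")}`;
}
```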

## Step 7: Next Steps

Present next steps using AskUserQuestion based on the analysis results:

**If data shows declining engagement** (current < baseline by >15%):

- "Run /linkedin:report for full weekly breakdown"
- "Run content audit to review strategy"
- "Analyze your top post to understand what worked"

**If data shows strong performance** (current > baseline by >15%):

- "Run /linkedin:report for the full numbers"
- "Create more content in your top format"
- "Draft your next post while insights are fresh"

**If first import:**

- "Run /linkedin:report for your first performance report"
- "Import 2-3 more weeks for trend analysis"
- "Tip: Export weekly every Monday for best tracking"

**If mixed results:**

- "Run /linkedin:report for complete breakdown"
- "Review trend analysis for diverging metrics"
- "Check which formats and topics drove results"

Present using AskUserQuestion with the top 3 most relevant suggestions.

## Step 8: Demographics Sync Suggestion

After completing the import workflow, check if assets/audience-insights/demographics.md still has placeholder data:

```bash
grep -cE '\[Industry name\]|\[Function\]|\[Country\]|\[X\]%' ${CLAUDE_PLUGIN_ROOT}/assets/audience-insights/demographics.md 2>/dev/null
```

If placeholder count is > 10 (still mostly unfilled), suggest:

"While you're in LinkedIn Analytics exporting CSV data, you can also capture your audience demographics. Run /linkedin:setup and choose option 5 (Demographics) to fill in your audience insights with real data."

## Error Handling

If the import fails:

1. Check the CSV format - LinkedIn sometimes changes export format
2. Verify the file path - ensure the file is in `assets/analytics/exports/`
3. Check file permissions - the CLI needs read access
4. Show the error message and suggest solutions

Common errors:

- **File not found**: Check the filename (case-sensitive)
- **Invalid CSV format**: Verify this is a LinkedIn analytics export
- **Permission denied**: Check file permissions with `ls -l`

## Reference Files

The import system creates:

- `assets/analytics/posts/YYYY-WXX.json` - Weekly post data
- `assets/analytics/metadata.json` - Import tracking and baseline metrics
- `assets/analytics/baselines.json` - Statistical baselines for anomaly detection

## State Tracking

After import, the system automatically:

- Updates baseline metrics (mean, median, std dev for each metric)
- Detects and flags anomalies (posts >2 sigma from baseline)
- Organizes posts by ISO week for trend analysis
- Preserves historical data (never overwrites existing weeks)
- Updates `last_import_date` and `last_import_week` in the state file
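The baseline math and the 2-sigma check can be sketched as follows (a minimal sketch assuming population standard deviation; the CLI's exact formula isn't specified here):

```typescript
// Compute per-metric baseline statistics and flag values beyond 2 sigma.
export function stats(values: number[]): { mean: number; median: number; stdDev: number } {
  const n = values.length;
  const mean = values.reduce((a, b) => a + b, 0) / n;
  const sorted = [...values].sort((a, b) => a - b);
  const median = n % 2 ? sorted[(n - 1) / 2] : (sorted[n / 2 - 1] + sorted[n / 2]) / 2;
  // Population standard deviation (divide by n).
  const stdDev = Math.sqrt(values.reduce((a, b) => a + (b - mean) ** 2, 0) / n);
  return { mean, median, stdDev };
}

export function isAnomaly(value: number, mean: number, stdDev: number): boolean {
  return stdDev > 0 && Math.abs(value - mean) > 2 * stdDev;
}
```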