Close to 100,000 tracks are uploaded to streaming platforms every day. More than 200,000 when you count all platforms combined. About 12 million creators publish music worldwide, and only 1% of them account for 90% of streams. Somewhere in the remaining 99% are the artists who will define the next five years of your roster. Finding them is only half the problem. The harder part is separating the artists who will sustain a career from the ones who will stall after their first signing advance runs out.
The scouting challenge has intensified heading into 2026. AI-generated tracks now flood streaming catalogs alongside human releases. Deezer alone receives 10,000 fully AI-generated tracks daily, roughly 10% of its new content. New fraud detection systems flag suspicious streaming spikes and unusual listening patterns. YouTube withdrew its streaming data from all Billboard U.S. and global charts in January 2026, altering how chart performance is measured and interpreted. The information landscape is shifting faster than ever, and A&R teams that rely on instinct alone will miss what the data reveals, while teams that rely on data alone will miss what instinct catches.
Structured artist evaluation solves this. A scout report is a standardized document that captures what you observed, what the data shows, and what the opportunity looks like, all in a format that allows consistent comparison across every artist your team evaluates. Without one, scouting decisions drift toward gut reactions and recency bias. With one, you build an institutional record that sharpens your judgment over time and gives your entire A&R department a shared language for evaluating talent.
As Sony Music CEO Rob Stringer has stated, modern A&R requires both an artistic ear and analytical eyes. This guide covers how to build that evaluation framework: what data to pull, how to interpret it, what red flags to catch, and how to structure the report that turns raw scouting into an investment recommendation.
What Data Should You Pull for a Scout Report?
A scout report is only as useful as the data underneath it. The goal is not to collect every available number. It is to pull the specific metrics that reveal whether an artist can acquire an audience and, more importantly, retain one. Audience retention is the central challenge. Labels are effective at helping artists acquire listeners, but retention depends on the artist's ability to build genuine connection. Your scout report needs to distinguish between the two.
Streaming Metrics
Streaming data forms the quantitative backbone of any evaluation. Pull these from Spotify for Artists (if the artist shares access), or cross-platform analytics tools.
Monthly listener trajectory. A single monthly listener count is meaningless without context. What matters is the direction and consistency of the trend over the past 6 to 12 months. Is the audience growing organically release over release, or did a single viral moment create a spike that has since decayed? Steady, compounding growth is a stronger signal than a dramatic peak followed by a decline. Believe's Global Head of Talent Scouts, Camille Moussard, emphasizes that her A&R team pays close attention to how productive artists are on streaming platforms: audiences want fresh music released regularly, and release cadence is one of the most important criteria for identifying high-potential artists.
Save rate. The percentage of listeners who save a track to their library. This is one of the strongest indicators of genuine audience interest because it reflects a deliberate decision to return to the music. Context matters here: niche genres often produce higher save rates from smaller, more dedicated audiences. An improving trend across releases matters more than hitting a universal benchmark. That said, rates above 3% generally indicate healthy engagement, and rates above 5% suggest strong listener intent.
Completion rate. How often listeners play a track to the end. This metric is heavily influenced by song length and genre. A seven-minute ambient track will naturally show lower completion than a three-minute pop song. Compare similar-length songs within the artist's catalog rather than against external averages. What you are looking for is whether listeners stay through the track or skip early, and whether completion is improving with each release.
Skip rate. The complement of completion rate. Lower is better, but "good" varies by genre and playlist context. A song placed in a discovery playlist will naturally see higher skips than one on a listener's personal playlist. Track the artist's skip rate across similar release types to identify the real trend.
Listener-to-follower conversion rate. The percentage of listeners who become followers. This reveals whether casual exposure translates into ongoing interest. Viral-driven artists may show lower conversion but higher volume. Niche artists often show higher conversion from smaller pools. A consistently improving conversion rate is a stronger signal than a high absolute number, because it suggests the artist is building sticky appeal rather than relying on algorithmic placement alone.
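The save-rate benchmarks and the emphasis on conversion trend above can be expressed as a small sketch. The 3% and 5% thresholds follow this guide; the function names and the strict "every release improves" trend test are illustrative assumptions, not a standard methodology.

```python
# Sketch: classifying save rate against the rough benchmarks above.
# Thresholds (3% / 5%) come from the guide; remember that niche genres
# can run higher, so treat these as starting points, not hard rules.

def classify_save_rate(saves: int, listeners: int) -> str:
    """Return a rough engagement label for a single release."""
    if listeners == 0:
        return "no data"
    rate = saves / listeners
    if rate > 0.05:
        return "strong listener intent"
    if rate > 0.03:
        return "healthy engagement"
    return "below benchmark (check genre context)"

def conversion_trend(rates: list[float]) -> str:
    """An improving listener-to-follower conversion trend matters more
    than any single number; compare each release to the previous one."""
    if len(rates) < 2:
        return "insufficient history"
    improving = all(b >= a for a, b in zip(rates, rates[1:]))
    return "improving" if improving else "mixed/declining"
```

For example, a release with 620 saves from 12,000 listeners (about 5.2%) would classify as strong listener intent, while a flat or choppy conversion history across releases would surface as "mixed/declining" even if the latest number looks good in isolation.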
Playlist sources. Where is the listening coming from? Editorial placements from platform curators indicate industry recognition and can be volatile. Algorithmic placements (Discover Weekly, Release Radar) suggest the music is performing well in recommendation systems. User-generated playlist adds indicate organic audience activity. A healthy artist shows a mix, with growing algorithmic and user-generated share over time. Heavy dependence on a single editorial playlist is a risk factor because that support can disappear. Some specialized tools like Artist.Tools now track more than 100,000 playlists within the Spotify ecosystem and include bot detection features that help evaluate playlist credibility and identify artificial streaming.
Shazam data. Often overlooked, Shazam activity reveals real-world discovery. If people are hearing the music in physical spaces (clubs, shops, radio, live settings) and tagging it, that is a strong organic signal. Geographic Shazam data also helps identify cities where the artist has real traction, which informs touring potential and market prioritization. A&R consultants at major labels, including freelance scouts at Atlantic Records, routinely analyze Shazam data alongside streaming and social metrics when building artist reports.
Chart data (with a January 2026 caveat). YouTube withdrew its streaming data from all Billboard U.S. and global charts effective January 16, 2026, ending a partnership that dated back to 2013 for the Hot 100 and 2019 for the Billboard 200. The dispute centers on how Billboard weights subscription streams versus ad-supported streams (currently a 2.5:1 ratio, narrowed from 3:1). For A&R scouting, this means Billboard chart positions from January 2026 onward no longer reflect YouTube consumption. Artists whose audiences skew heavily toward YouTube (particularly in hip-hop, Latin music, and global pop) may appear weaker on Billboard charts than their actual listenership warrants. Factor this into any chart-based evaluation.
Social and Content Metrics
Social data reveals how an artist communicates, builds community, and converts attention into loyalty. These metrics are more qualitative than streaming data and require closer interpretation.
Engagement rate. Comments and shares carry more weight than likes. A like is passive. A comment reflects active participation. A share indicates someone is willing to associate their own identity with the artist's content. Calculate engagement rate as total interactions divided by followers, and compare it against norms for the artist's follower tier. Accounts with 10,000 followers should show higher engagement rates than accounts with 500,000. In 2026, A&R executives place greater weight on comments, repeat engagement, live draw, and fan signals than on raw streaming numbers. Surface-level metrics have become less reliable as success indicators.
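The basic calculation above (total interactions divided by followers) can be sketched directly. The optional weighting that counts comments and shares more heavily than likes reflects the reasoning in this section, but the specific 1x/2x/3x weights are illustrative assumptions, not an industry standard.

```python
# Sketch of the engagement-rate calculation described above.
# weighted=True applies illustrative 1x/2x/3x weights to reflect
# the idea that comments and shares signal more than likes.

def engagement_rate(likes: int, comments: int, shares: int,
                    followers: int, weighted: bool = False) -> float:
    """Engagement rate = interactions / followers.
    Compare the result against norms for the account's follower tier:
    a 10,000-follower account should run higher than a 500,000 one."""
    if followers == 0:
        return 0.0
    if weighted:
        interactions = likes + 2 * comments + 3 * shares
    else:
        interactions = likes + comments + shares
    return interactions / followers
```

A post with 800 likes, 150 comments, and 50 shares on a 10,000-follower account yields a 10% unweighted rate; the weighted variant pushes that to 12.5% because the comments and shares carry more signal.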
Content consistency and identity. Is the artist posting regularly with a clear visual and tonal identity? Can you watch their content for 30 seconds and understand who they are, what they sound like, and why someone would follow them? Inconsistent posting, confused messaging, or a feed that feels like it belongs to five different people are warning signs. Strong content identity does not require polish. It requires clarity.
Audience authenticity. Look for signs that the audience is real. Genuine engagement includes comments that reference specific songs, lyrics, or personal experiences with the music. Bot-driven audiences show patterns: generic comments ("Great track!"), sudden follower spikes with no corresponding content event, follower counts that dramatically exceed streaming numbers, and geographic distribution that does not align with where the music is being played. Next-generation discovery platforms now build in fraud checks and highlight engagement quality instead of raw volume, giving scouts additional tools to verify authenticity.
Platform diversity. Is the artist building presence across multiple platforms, or is all their traction concentrated on one? An artist who has built audiences on TikTok, Instagram, YouTube, and Spotify simultaneously demonstrates broader appeal and reduces platform-dependency risk. An artist whose entire audience exists on a single platform is more vulnerable to algorithm changes.
Live Performance Data
Live data reveals something streaming metrics cannot: whether the artist can hold a room.
Venue capacity progression. Track the size of venues the artist has played over the past 12 to 18 months. Are they moving from 100-capacity rooms to 300-capacity rooms? Selling out consistently at each level before stepping up? Venue progression is one of the most reliable indicators of sustainable career growth because it reflects real, paying fans in physical spaces.
Ticket sale velocity. How quickly do shows sell? An artist who sells out a 200-cap room in two days has different momentum from one who takes six weeks. The speed of sellout reveals the intensity of fan demand.
Geographic concentration and spread. Where are the shows happening? An artist who can sell out in three or four cities has proven regional appeal. One who sells only in their hometown has a narrower base. Cross-reference this with Shazam and streaming geographic data to see if the live audience matches the digital one.
Merchandise revenue per head. If available, per-capita merchandise spend at shows indicates how deeply fans connect with the artist's brand. Strong merch numbers relative to ticket sales suggest an audience willing to invest beyond the music itself.
How Do You Evaluate What the Numbers Cannot Show?
Data tells you what is happening. It does not tell you why, and it cannot predict creative trajectory. The qualitative dimensions of a scout report are where A&R judgment matters most, and where experienced scouts separate themselves from analysts. As Chartmetric CCO Chaz Jenkins puts it, algorithms and automated tools cannot assess personal relationships very well, nor whether the artist manager is good, whether the agent genuinely believes in the artist, or whether the artist really wants to get involved in promotion. These are critical success factors, and they make the human touch indispensable.
Artistic Identity and Creative Direction
Distinctiveness. Can you describe this artist's sound and positioning in one sentence without referencing another artist? If the only way to explain them is "they sound like X meets Y," the identity may not be developed enough to sustain long-term career investment. The strongest signings have a clear sonic and visual identity that is recognizable within seconds. In 2026, with AI-generated music flooding platforms alongside human releases, distinctiveness has become even more critical. AI-powered tools like Warner's Sodatone can surface artists based on sonic profiling and mood-matching algorithms, but they cannot evaluate the intangible quality that makes an artist's voice genuinely their own.
Creative depth. Does the artist have a catalog that suggests range, or are they a one-song artist? Look at the quality and consistency across their last five to ten releases. One strong single surrounded by forgettable tracks is a different proposition from an artist whose last six releases all demonstrate improving craft. Development deals require creative depth. If the artist cannot sustain quality across a full album cycle, the investment risk increases substantially.
Artistic vision and self-awareness. In conversation, can the artist articulate who they are, where they want to go, and what kind of career they want to build? Nick Graf, former SVP of A&R at Interscope Records, identifies three categories of artists: those creating art for art's sake, those focused on commercial success, and those who balance both. Each category is valid, but the label needs to understand which one they are signing. An artist who cannot articulate their own vision will be harder to develop and harder to advocate for internally.
Team and Infrastructure
Existing team quality. Who is already around the artist? A strong manager, a competent booking agent, or an engaged publisher suggests the artist has attracted professionals willing to invest their own time and reputation. An artist with no team at all is not necessarily a red flag, but it increases the development burden on the label.
Business literacy. Does the artist understand the basics of how their career generates revenue? Do they know what their streams are worth, how their publishing works, how their live income breaks down? Business-literate artists make better partners because they understand what the label is investing and what they are expected to deliver.
Willingness to collaborate. A&R works best as a creative partnership. Some artists thrive with hands-on A&R involvement: connecting them with producers, songwriters, and engineers. Others need space to create independently. Both approaches work, but the label needs to know which type of artist they are signing. An artist who resists all creative input but also cannot self-direct will struggle in any label environment.
Market Timing and Competitive Positioning
Genre and trend alignment. Is the artist riding a wave that is cresting, or are they positioned in a space with room to grow? Signing at the peak of a trend means competing with every other label that noticed the same thing. Signing ahead of a trend requires more conviction but offers greater upside. Believe's A&R team structures scouting by territory and genre for this reason. Each scout focuses on one market and a few genres, building deep local expertise that allows them to spot momentum before it shows up in cross-platform dashboards.
Roster fit. Does this artist fill a gap on your roster, or do they compete with artists you already have? Major labels typically maintain 5 to 10 true superstars but may have 60 or more artists signed across various deal types. That creates intense internal competition for resources and attention. Signing an artist who directly competes with an existing priority act sets both up for resource conflicts.
Competitive landscape. Are other labels looking at this artist? Competitive interest can validate your assessment, but it also increases the cost and urgency of a signing. Document what you know about other label interest so leadership can factor it into deal structure and timeline.
What Are the Red Flags That Should Stop a Signing?
Red flags do not always mean "pass." Some are manageable. Others are deal-breakers. The purpose of documenting them in a scout report is to ensure they are visible and weighed before a financial commitment is made.
Purchased or artificial metrics. Follower-to-stream ratios that are dramatically out of proportion, engagement rates well below norms for the artist's follower count, comment quality that suggests bot activity, and geographic distribution that does not align with streaming or live performance data. Platforms actively detect and remove artificial streams, and artists with histories of metric inflation can face permanent algorithmic penalties. This is a serious red flag because it means the data you are evaluating may not reflect real demand. New fraud detection systems in 2026 are more sophisticated than ever, flagging suspicious streaming spikes and unusual listening patterns automatically. If the tools are catching anomalies, pay attention.
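Two of the patterns listed above (follower counts far exceeding streaming numbers, and engagement well below tier norms) lend themselves to a quick heuristic screen. The ratio and threshold values here are illustrative assumptions; real fraud detection systems combine many more signals, so treat this as a first-pass filter that prompts closer review, never a verdict.

```python
# Heuristic sketch of two artificial-metric patterns named above.
# The 10x follower/listener ratio and the 50%-of-norm engagement
# cutoff are illustrative assumptions, not calibrated thresholds.

def authenticity_flags(followers: int, monthly_listeners: int,
                       engagement_rate: float, tier_norm: float) -> list[str]:
    """Return human-readable red flags for the scout report's Concerns
    section. An empty list means no automatic flags, not a clean bill."""
    flags = []
    # Follower count dramatically exceeding the streaming audience
    if monthly_listeners > 0 and followers / monthly_listeners > 10:
        flags.append("follower count far exceeds streaming audience")
    # Engagement well below the norm for this follower tier
    if engagement_rate < 0.5 * tier_norm:
        flags.append("engagement well below follower-tier norm")
    return flags
```

An account with 500,000 followers but only 20,000 monthly listeners and a 0.4% engagement rate (against a 2% tier norm) would trip both flags; a 30,000-follower account with 25,000 listeners and healthy engagement would trip none.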
Declining trajectory with no explanation. An artist whose numbers are moving in the wrong direction needs context. A temporary dip after a release cycle is normal. A sustained 6-to-12-month decline with no strategic explanation (the artist took time off, changed genre, relocated) suggests the audience is leaving. Signing into a decline requires a credible plan for reversal.
No artistic identity. If the artist cannot articulate who they are, what they stand for, or why their music matters, the label will have to build that identity from scratch. That is possible but expensive, time-consuming, and risky. Development deals exist for this, but they carry higher risk and require significant A&R time investment.
Difficult professional reputation. Industry whispers about behavior, reliability, or interpersonal conflicts. This is sensitive territory and should be handled carefully, but it is relevant. A brilliant artist who cannot show up to sessions on time, alienates collaborators, or creates conflict with every team member will drain resources that could go to other roster priorities. Document what you hear, note the sources, and distinguish between isolated incidents and patterns.
Unrealistic expectations. Demands that do not match the artist's current level: advance expectations far above market for their metrics, creative control demands that prevent any label input, or timeline expectations that assume instant priority status on a roster with established acts. An honest conversation early can reveal whether expectations are adjustable or deeply held.
Single-source dependency. All traction driven by one playlist, one viral moment, one platform, or one geographic market. If that single source disappears, does the artist still have an audience? Concentration risk is manageable if the label has a plan to diversify, but it should be documented as a known vulnerability. The YouTube/Billboard chart separation reinforces this point: artists whose audiences were heavily YouTube-dependent now appear weaker on Billboard without any actual change in their listenership.
How Should You Structure the Scout Report?
A standardized format ensures every artist is evaluated against the same criteria and that reports are useful to people beyond the scout who wrote them. The following structure works for both quick initial assessments and deep-dive evaluations.
1. Executive Summary
One paragraph, no more than 100 words. State who the artist is, what the opportunity looks like, and your recommendation. This is the section that leadership reads first and sometimes the only section they read. Make it count.
2. Artist Profile
Basic identification: name, location, genre, age, management (if any), current distribution, publishing status. Include links to their primary streaming profiles, social accounts, and any notable press or features.
3. Metrics Snapshot
Present the key quantitative data with context. Do not just list numbers. Contextualize them against the artist's career stage, genre norms, and trajectory.
Metric Category | Data Point | Trend | Context |
Monthly listeners (Spotify) | [Number] | [Direction over 6-12 months] | [Organic vs. playlist-driven] |
Save rate (latest release) | [Percentage] | [Improving/stable/declining] | [Genre benchmark] |
Follower-to-listener ratio | [Ratio] | [Direction] | [Conversion health] |
Social engagement rate | [Percentage] | [Direction] | [Platform, follower tier norms] |
Live capacity | [Venue size] | [Progression] | [Markets, sell-through rate] |
Shazam activity | [Volume] | [Direction] | [Key cities] |
Release cadence | [Frequency] | [Consistent/irregular] | [Audience expectation alignment] |
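Teams that keep many reports benefit from storing the snapshot above in a consistent machine-readable shape so artists can be compared side by side. One minimal way to mirror the table is a pair of dataclasses; the field names and example values here are illustrative, not a standard schema.

```python
# Sketch: the metrics snapshot table as a simple data structure,
# so every scout report captures the same fields in the same order.
from dataclasses import dataclass, field

@dataclass
class MetricRow:
    category: str    # e.g. "Monthly listeners (Spotify)"
    data_point: str  # the number or value, kept as text with units
    trend: str       # e.g. "improving", "stable", "declining"
    context: str     # genre benchmark, career stage, source mix

@dataclass
class MetricsSnapshot:
    artist: str
    rows: list[MetricRow] = field(default_factory=list)

# Hypothetical example values, for illustration only:
snapshot = MetricsSnapshot("Example Artist", [
    MetricRow("Monthly listeners (Spotify)", "42,000",
              "up over 9 months", "organic, playlist share declining"),
    MetricRow("Save rate (latest release)", "4.1%",
              "improving", "above the 3% healthy-engagement benchmark"),
])
```

Keeping the trend and context as free text preserves the scout's judgment while still letting the team filter and sort reports programmatically later.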
4. Qualitative Assessment
Cover the dimensions that data cannot capture: artistic identity and distinctiveness, creative depth across catalog, vision and self-awareness, team quality and business literacy, market positioning and timing. Write in prose, not bullet points. This section is where the scout's experience and judgment matter most.
5. Strengths
What is working? What gives this artist upside? Be specific. "Good music" is not a strength. "Three consecutive releases with improving save rates and completion rates, suggesting growing craft and audience retention" is a strength.
6. Concerns
What gives you pause? Document every risk, whether it is manageable or not. This section protects the organization by ensuring that no signing decision is made without full awareness of the known vulnerabilities.
7. Competitive Intelligence
What do you know about other label interest? What deal structures might be in play? What is the likely timeline before this artist signs somewhere?
8. Recommendation
Three outcomes: pass, watch, or pursue.
Pass means the artist does not meet the threshold for investment at this time. Briefly explain why.
Watch means there is potential but the timing, data, or risk profile does not yet justify an approach. Specify what would need to change and set a date to re-evaluate. This is the most valuable recommendation in scouting because it builds a pipeline of artists you can move on quickly when the moment is right.
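A watch recommendation is most useful when its re-evaluation condition is concrete enough to check mechanically. As a sketch, one such trigger (a save rate above 4% and monthly listeners holding above 50,000 for three consecutive months) could be encoded like this; the specific thresholds are illustrative, and each watch entry would carry its own.

```python
# Sketch of a concrete "watch" trigger: re-evaluate when the latest
# release holds a >4% save rate AND monthly listeners stay above
# 50,000 for three consecutive months. Thresholds are illustrative.

def watch_trigger_met(save_rate: float,
                      monthly_listeners: list[int]) -> bool:
    """monthly_listeners is ordered oldest to newest; only the most
    recent three months count toward the trigger."""
    if save_rate <= 0.04:
        return False
    recent = monthly_listeners[-3:]
    return len(recent) == 3 and all(n > 50_000 for n in recent)
```

Encoding triggers this way turns the watch list into a pipeline you can re-check on a schedule rather than a pile of notes that depends on someone remembering to look.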
Pursue means you recommend approaching the artist. Include a suggested deal framework (development, distribution, joint venture, traditional), an estimated advance range, and the strategic rationale for how this artist fits the roster.
Your Next Step
Build your standardized scout report template using the framework above. Use it consistently for the next 30 evaluations, then review the reports collectively. You will see patterns in what you look for, what you miss, and where your instincts diverge from the data. That is where your scouting gets sharper.
Frequently Asked Questions
What is a scout report in the music industry?
A scout report is a standardized document that A&R professionals use to evaluate and compare artists for potential signing. It captures quantitative data (streaming metrics, social engagement, live performance numbers), qualitative assessment (artistic identity, creative depth, team quality), identified risks, and a clear recommendation to pass, watch, or pursue. The report creates an institutional record that allows A&R teams to track prospects over time and make consistent, data-informed investment decisions rather than relying solely on gut instinct.
What streaming metrics matter most for A&R scouting?
The most revealing streaming metrics for A&R evaluation are save rate (indicating genuine listener intent to return), listener-to-follower conversion rate (showing whether exposure translates to ongoing interest), completion rate (revealing whether listeners stay through the full track), release cadence (how consistently the artist delivers new music), and monthly listener trajectory over 6 to 12 months (showing sustained growth versus single-spike dependence). All metrics should be interpreted in context. Genre, song length, and career stage significantly affect what "good" looks like. An improving trend across consecutive releases is a stronger signal than any single number.
How do you identify fake or purchased metrics when scouting artists?
Purchased metrics produce recognizable patterns: follower counts dramatically higher than monthly streaming numbers, engagement rates well below industry averages for the artist's follower tier, generic or bot-like comments lacking specific song references, sudden follower spikes with no corresponding content or release event, and geographic audience distribution that does not match streaming or live performance data. In 2026, new fraud detection systems automatically flag suspicious streaming spikes and unusual listening patterns. Specialized tools like Artist.Tools include bot detection features for evaluating playlist credibility. Streaming platforms actively remove artificial streams, and artists with histories of metric inflation may face permanent algorithmic penalties. Document any authenticity concerns in the scout report.
Should A&R decisions be based on data or instinct?
Both. As Chartmetric CCO Chaz Jenkins puts it, A&R has always been about data. In earlier decades, that data came from attending live shows, reading the room, and knowing local listening trends. Today it comes from streaming analytics and social platforms, with 20,000 consumer data points generated per person per year compared to just two (a CD purchase and a concert ticket) twenty years ago. The role of instinct is interpreting what the data cannot capture: creative potential, artistic trajectory, emotional resonance, and the intangible quality that makes someone worth investing in. The scout report framework exists to ensure that instinct is informed by data rather than operating independently of it.
What is the difference between a "watch" and a "pursue" recommendation?
A "watch" recommendation means the artist shows potential but the timing, data, or risk profile does not yet justify an active approach. It should specify exactly what conditions would trigger a re-evaluation, such as "re-evaluate if next release maintains above 4% save rate and monthly listeners hold above 50,000 for three consecutive months." A "pursue" recommendation means the scout believes the artist is ready for an approach and includes a suggested deal structure, estimated advance range, and strategic rationale for roster fit. The watch category is the most valuable in scouting because it builds a pipeline of pre-evaluated artists you can act on quickly when conditions change.
Sources
OnesToWatch. "Chartmetric vs Music Intelligence Tools: Artist Discovery." January 2026. Comparison of Chartmetric, Viberate, Soundcharts, Artist.Tools, Luminate, and Spot On Track for A&R scouting in 2026, including bot detection features, cross-platform tracking capabilities, and the hybrid data-plus-human scouting approach.
OnesToWatch. "Affordable Chartmetric Alternatives for A&R Scouts 2026." February 2026. Confirmed 2026 pricing (Viberate $19.90/month, Soundcharts and Songstats under $50/month), hybrid scouting methodology, and the principle that successful A&R scouting in 2026 relies on a mix of affordable quantitative tools and qualitative human insight.
OnesToWatch. "A&R Discovery Tools That Help Indie Artists Get Signed." December 2025. Reports that A&R teams in 2026 place greater weight on comments, repeat engagement, live draw, and fan signals. Documents new fraud detection systems, AI-generated content oversaturation, and the shift from raw volume to engagement quality in artist evaluation.
Reprtoir. "AI as an A&R Assistant: Hype or Hidden Gem?" 2026. Coverage of Warner's Sodatone AI scouting platform, Berklee College of Music's findings on predictive analytics adoption, mood-matching algorithms for sonic profiling, and the ~100,000 tracks uploaded daily to streaming platforms.
YouTube Blog / Billboard / Variety. "YouTube Withdraws Streaming Data from Billboard Charts." December 2025 to January 2026. YouTube ceased providing streaming data to all Billboard U.S. and global charts effective January 16, 2026, over the ad-supported vs. subscription stream weighting dispute (2.5:1 ratio). Directly impacts how chart performance should be interpreted in scout reports.
Believe. "How A&Rs Use Data to Identify High Potential Artists." Interview with Camille Moussard, Global Head of Talent Scouts, on territory-focused scouting, release cadence as an evaluation criterion, and the use of internal distribution data across 100+ countries and 150+ digital stores to identify high-potential artists.
Water and Music. "How A&Rs Use Data to Scout and Evaluate Artists." Analysis of push vs. pull scouting methods, analytics tool watchlists, and the "cold start" problem. Cited data: approximately 12 million creators worldwide, 1% accounting for 90% of streams. Includes Chaz Jenkins interview on audience retention as the central A&R challenge.
Former Interscope Executive (Nick Graf) Interview. Insider perspective on major label A&R operations: three artist categories, roster dynamics (5-10 superstars vs. 60+ total signings), internal competition for resources, data-driven signings vs. development deals.
