How PourPlan Ratings Work
Transparent, data-driven ratings to help you plan better wine tasting experiences.
Rating Philosophy
We believe great wine tasting experiences come from more than just great wine. The fee structure, hospitality, atmosphere, and even parking all shape your visit. Our ratings evaluate wineries across 7 specific dimensions so you can make informed decisions based on what matters to you.
Every winery gets a score from 1-10 (higher is better) based on a weighted average of these dimensions. We're transparent about our methodology, data sources, and confidence levels.
The 7 Dimensions
Each dimension contributes to the overall rating with a specific weight based on its impact on the tasting experience.
Tasting Fee
Weight: 20%. What we evaluate: Fee amount relative to the region, refundability, waiver conditions, number of pours, and pour size.
How we score (1-10 scale)
Base score (relative to regional average):
- 10: Free tasting
- 9: Well below regional average
- 8: Below regional average
- 7: At regional average
- 6: Above regional average
- 5: Well above regional average
- 4 or below: Significantly above regional average
Regional context matters: A $25 fee in Woodinville (where the average is ~$25-30) scores differently than in a region where $15 is typical.
Adjustments:
- Refundable with purchase: +1.5 points
- Generous pour size: +0.5 points
- 6+ pours included: +0.5 points
- Small pour size: -0.5 points
- 3 or fewer pours: -0.5 points
- Pressure to buy: -2.0 points
Example: A $25 fee in Woodinville (base 7.0, at the regional average) that's refundable with purchase (+1.5) and includes 5 generous pours (+0.5 for pour size; 5 pours triggers no count adjustment) = 9.0/10
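The fee rubric above can be sketched in code. The point values come straight from the rubric; the function shape, category names, and the clamp to the 1-10 scale are illustrative assumptions, not PourPlan's actual implementation:

```python
# Sketch of the Tasting Fee rubric. Point values follow the published
# adjustments; names and the 1-10 clamp are illustrative assumptions.

BASE_SCORES = {
    "free": 10.0,
    "well_below_average": 9.0,
    "below_average": 8.0,
    "at_average": 7.0,
    "above_average": 6.0,
    "well_above_average": 5.0,
    "significantly_above_average": 4.0,
}

def tasting_fee_score(fee_vs_region, *, refundable=False, pour_size="normal",
                      num_pours=5, pressure_to_buy=False):
    score = BASE_SCORES[fee_vs_region]
    if refundable:
        score += 1.5              # refundable with purchase
    if pour_size == "generous":
        score += 0.5
    elif pour_size == "small":
        score -= 0.5
    if num_pours >= 6:
        score += 0.5
    elif num_pours <= 3:
        score -= 0.5
    if pressure_to_buy:
        score -= 2.0
    return max(1.0, min(10.0, score))  # keep within the 1-10 scale

# The worked example: at regional average (7.0), refundable (+1.5),
# 5 generous pours (+0.5 for pour size) = 9.0
print(tasting_fee_score("at_average", refundable=True,
                        pour_size="generous", num_pours=5))  # -> 9.0
```

Note that a 5-pour tasting falls between the "3 or fewer" penalty and the "6+" bonus, so only the pour-size adjustment applies in the worked example.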
Wine Value
Weight: 20%. What we evaluate: Price points relative to quality, professional critic ratings, and visitor perceptions of value.
How we score (1-10 scale)
9-10: Outstanding value — High-quality wines (90+ critic scores) at fair prices ($25-45), or exceptional quality-to-price ratio
7-8: Good value — Quality wines (85-89 points) at reasonable prices ($35-55), or very good wines at mid-range
5-6: Fair value — Decent wines at market prices, or good wines priced slightly high ($55-75)
3-4: Below average value — Average wines at premium prices, or good wines significantly overpriced ($75+)
1-2: Poor value — Mediocre wines at high prices, or quality doesn't justify cost
Professional ratings (Wine Spectator, Wine Enthusiast) carry the most weight, followed by in-person assessments, then visitor reviews.
Hospitality
Weight: 15%. What we evaluate: Staff friendliness, knowledge, attentiveness, and overall service quality.
How we score (1-10 scale)
9-10: Exceptional — Staff goes above and beyond, highly knowledgeable, warm and engaging, makes you feel like a VIP
7-8: Very good — Friendly and attentive, knowledgeable about wines, welcoming atmosphere, no pressure
5-6: Adequate — Professional but not particularly warm, basic wine knowledge, serviceable but unmemorable
3-4: Below average — Rushed, inattentive, limited knowledge, makes you feel unwelcome or pressured
1-2: Poor — Rude, disinterested, or overtly sales-focused, actively detracts from experience
In-person observations weighted at 70%, visitor sentiment at 30%.
Wine Quality
Weight: 15%. What we evaluate: Professional critic scores, awards, tasting impressions, and visitor feedback.
How we score (1-10 scale)
9-10: Outstanding — Consistent 93+ point scores from tier 1 critics, multiple awards, exceptional complexity and balance
7-8: Very good — Regular 88-92 point scores, well-crafted wines with distinct character and quality craftsmanship
5-6: Good — Some 85-87 point scores, solid wines that are well-made but not exceptional
3-4: Below average — Few professional scores or low ratings, noticeable flaws, limited complexity
1-2: Poor — No professional recognition, significant quality issues, poorly made wines
How we tier professional ratings:
Tier 1 (Highest confidence): Internationally recognized wine publications and critics with rigorous tasting protocols
Tier 2 (Medium confidence): Regional publications, established wine writers, reputable trade publications
Tier 3 (Lower confidence): Local competitions, small regional awards, self-reported scores
Not all 94-point scores are equal. A score from a tier 1 international publication carries more weight and higher confidence than the same score from a local competition. Professional ratings are primary; in-person tastings and visitor reviews provide supporting context.
Vibe
Weight: 15%. What we evaluate: Atmosphere, ambiance, energy level, decor, and overall feeling of the space.
How we score (1-10 scale)
9-10: Exceptional — Distinct, memorable atmosphere (romantic, lively, cozy), thoughtful design, music and lighting enhance experience
7-8: Very good — Inviting and comfortable, cohesive aesthetic, pleasant ambiance that encourages lingering
5-6: Adequate — Clean and functional but generic, nothing special about the atmosphere
3-4: Below average — Uninviting or uncomfortable, poor lighting/music choices, feels like an afterthought
1-2: Poor — Unpleasant atmosphere, dingy or neglected space, actively detracts from tasting experience
In-person observations weighted at 70%, visitor descriptions at 30%.
Setting
Weight: 10%. What we evaluate: Physical environment, views, landscaping, outdoor spaces, and architecture.
How we score (1-10 scale)
9-10: Stunning — Vineyard views, water views, or mountain vistas; beautifully landscaped grounds; architecturally distinctive building
7-8: Very attractive — Pleasant outdoor space (patio, deck, gardens); mature trees or landscaping; appealing architecture
5-6: Adequate — Basic outdoor area or no outdoor space but nice building; some landscaping; functional but not scenic
3-4: Below average — Industrial/warehouse setting with minimal landscaping; no outdoor space; uninspiring architecture
1-2: Poor — Unappealing location (strip mall, industrial park); no redeeming visual features; neglected exterior
100% based on in-person observation. Research cannot assess what you'll see.
Parking
Weight: 5%. What we evaluate: Parking availability, ease of access, lot size, and street parking proximity.
How we score (1-10 scale)
9-10: Excellent — Large dedicated lot with ample spaces, easy entry/exit, clearly marked, always available
7-8: Good — Dedicated lot or close street parking, usually available, reasonably easy access
5-6: Adequate — Street parking nearby, may require short walk, limited spaces during busy times
3-4: Challenging — Limited street parking, often full, requires circling or walking several blocks
1-2: Poor — No nearby parking, extremely difficult access, major deterrent to visiting
100% based on in-person observation. Research cannot reliably assess parking.
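Taken together, the overall 1-10 score is a weighted average of the seven dimension scores. A minimal sketch, using the weights stated above (the dimension keys and rounding are illustrative assumptions):

```python
# Weights for the seven dimensions, as stated above; they sum to 100%.
WEIGHTS = {
    "tasting_fee": 0.20,
    "wine_value": 0.20,
    "hospitality": 0.15,
    "wine_quality": 0.15,
    "vibe": 0.15,
    "setting": 0.10,
    "parking": 0.05,
}

def overall_score(dimension_scores):
    """Weighted average of per-dimension 1-10 scores, rounded to one decimal."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # sanity-check the weights
    return round(sum(WEIGHTS[d] * s for d, s in dimension_scores.items()), 1)

# Hypothetical winery: strong fee and quality scores, weaker setting.
scores = {"tasting_fee": 9.0, "wine_value": 7.5, "hospitality": 8.0,
          "wine_quality": 8.5, "vibe": 7.0, "setting": 6.0, "parking": 9.0}
print(overall_score(scores))  # -> 7.9
```

Because Setting and Parking carry only 15% combined, a winery in an unremarkable location can still rate highly on the strength of its wine, value, and hospitality.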
Two Types of Ratings
Research-Based Ratings
For most wineries, we analyze public information including:
- Visitor reviews - Sentiment analysis, theme extraction, specific mentions from public review platforms
- Professional wine ratings - Scores from Wine Spectator, Wine Enthusiast, critic reviews
- Tasting fee information - Fee policies from winery websites
- Wine selection analysis - Portfolio depth, varietals, pricing from publicly available sources
This gives us broad coverage (112 wineries rated) while maintaining consistency.
Coverage: 112 wineries rated. Research-based scores provide consistent evaluation across the region.
In-Person Manual Reviews
For select wineries, we conduct verified in-person visits where we:
- Taste the wines and assess quality firsthand
- Verify tasting fees and waiver policies at the source
- Document the space with 8 photo categories (parking, entrance, tasting room, patio, etc.)
- Experience hospitality and assess staff knowledge
- Observe setting and vibe that research can't capture
Manual reviews appear first on winery pages, highlighted with a gold background, as the authoritative source.
Verified Visits: 4 wineries visited. In-person reviews provide verified, first-hand data and photos.
How We Blend Research and Manual Reviews
When we visit a winery in person, we don't discard the research data—we blend it intelligently based on dimension-specific strategies:
Professional Ratings Available
Wine Value & Quality: 85-95% research (professional ratings), 5-15% manual verification
Professional critics provide objective, consistent wine scoring that in-person visits validate.
No Professional Ratings
Wine Value & Quality: 70% manual review, 30% visitor sentiment from public reviews
Without professional scores, in-person assessment becomes the primary source.
Subjective Dimensions
Hospitality & Vibe: 70% manual review, 30% visitor sentiment
Personal experience matters most, but we check for outliers against visitor patterns.
Observable Only
Setting & Parking: 100% manual review (authoritative)
Research cannot assess what you'll see or where you'll park. Manual reviews are definitive.
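The dimension-specific strategies above can be sketched as a single dispatch function. The splits come from the text; where a range is given (85-95% research), the 90/10 midpoint is an assumption, as are the function shape and dimension names:

```python
# Sketch of the dimension-specific blending rules. The 90/10 split is
# an assumed midpoint of the stated 85-95% range; the rest follows the
# published splits.

def blended_score(dimension, manual, research=None, visitor=None,
                  has_professional_ratings=False):
    if dimension in ("setting", "parking"):
        return manual  # observable only: 100% manual, authoritative
    if dimension in ("wine_value", "wine_quality"):
        if has_professional_ratings and research is not None:
            # Professional ratings available: mostly research,
            # manual visit as verification.
            return round(0.90 * research + 0.10 * manual, 2)
        # No professional scores: manual review is primary.
        return round(0.70 * manual + 0.30 * visitor, 2)
    if dimension in ("hospitality", "vibe"):
        # Subjective dimensions: personal experience first,
        # visitor sentiment as a check.
        return round(0.70 * manual + 0.30 * visitor, 2)
    raise ValueError(f"unknown dimension: {dimension}")

print(blended_score("setting", manual=8.0))            # -> 8.0
print(blended_score("vibe", manual=8.0, visitor=6.0))  # -> 7.4
```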
Confidence Levels & Transparency
Not all ratings are created equal. We show confidence levels for each dimension:
- High confidence: Based on a manual review or abundant data points from multiple reliable sources, including professional ratings
- Medium confidence: Based on a moderate amount of publicly available data or limited professional ratings
- Low confidence: Based on limited data points or information from non-local sources
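One way to operationalize the three confidence tiers above. The numeric thresholds here (25 reviews, 10 reviews) are purely hypothetical assumptions for illustration; only the tiers themselves come from the methodology:

```python
# Illustrative mapping of data availability to confidence tiers.
# The review-count thresholds are hypothetical; only the three tiers
# (high / medium / low) come from the methodology above.

def confidence_level(has_manual_review, review_count, has_professional_ratings):
    if has_manual_review or (review_count >= 25 and has_professional_ratings):
        return "high"    # manual review, or abundant multi-source data
    if review_count >= 10 or has_professional_ratings:
        return "medium"  # moderate public data or limited pro ratings
    return "low"         # limited data points

print(confidence_level(False, 40, True))   # -> high
print(confidence_level(False, 3, False))   # -> low
```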
Every winery detail page shows you:
- The overall aggregate score and individual dimension scores
- Whether it's research-based, manually reviewed, or a blend of both
- The specific rationale for each dimension (not just a number)
- Confidence levels for each dimension
- Data sources and when the rating was last updated
Frequently Asked Questions
Do wineries pay to be listed or rated?
No. PourPlan does not accept payment from wineries for inclusion, ratings, or placement. All ratings are based on public data analysis and independent in-person visits.
How often are ratings updated?
Both research-based and blended ratings are updated at least monthly as new visitor reviews and data become available. When a winery has a manual review, the blended score incorporates fresh research data each month, so even without a new in-person visit, the rating stays current. We revisit wineries for new manual reviews as needed to verify changes or re-evaluate dimensions like Setting and Parking.
Can a winery correct inaccurate information?
Yes. If a winery believes our data is inaccurate (e.g., outdated tasting fee, incorrect hours), they can contact us at [email protected]. We verify updates with public sources or schedule a visit to re-evaluate.
Why are some wineries not rated?
We're actively expanding coverage. Wineries without ratings typically have insufficient public data (very few reviews, no professional scores, limited web presence). We prioritize manual visits for high-traffic wineries first.
How do you keep manual reviews consistent?
Manual reviews follow a structured rubric with specific criteria for each dimension. We blend manual scores with research data to check for outliers. If a manual score differs significantly from visitor sentiment (3+ points), we use a 50/50 blend instead of 70/30.
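The outlier fallback described in that answer is simple enough to sketch directly. The 3-point threshold and the 70/30 and 50/50 splits come from the text; the function shape is an assumption:

```python
# Outlier check for subjective dimensions: when the manual score and
# visitor sentiment disagree by 3+ points, fall back to an even 50/50
# blend instead of the usual 70/30 manual-first weighting.

def subjective_blend(manual, visitor_sentiment):
    if abs(manual - visitor_sentiment) >= 3:
        return round(0.5 * manual + 0.5 * visitor_sentiment, 2)
    return round(0.7 * manual + 0.3 * visitor_sentiment, 2)

print(subjective_blend(9.0, 5.0))  # 4-point gap -> 50/50 -> 7.0
print(subjective_blend(8.0, 7.0))  # close agreement -> 70/30 -> 7.7
```

The effect is to pull the published score toward visitor consensus whenever a single in-person impression looks like an outlier.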
What if I disagree with a rating?
Ratings are subjective and reflect aggregate data, not any single visit. If you have feedback, we'd love to hear it at [email protected]. Your input helps us refine our methodology and identify wineries for manual re-visits.
Ready to Explore Rated Wineries?
Browse 112 rated wineries and plan your best tasting day yet.
Browse Rated Wineries