“Best” Albums of the 2010s

I didn’t realize so many legacy media outlets did decade-end lists.  A few outlets I’d never heard of also took big stabs.  I’m more interested in all of your favorites than in these, and I’ll be sure to engage if you share them.

Of all these, clearly Good Housekeeping Magazine has my ear https://www.goodhousekeeping.com/life/entertainment/g29859504/best-albums-from-the-2010s/

And a question for you all:  Do you prefer that these lists pretend to objectively rank “the best” or “most essential”?  What’s wrong with saying they’re just a particular person’s or source’s subjective “favorites”?


  • The “best” claim always strikes me as silly at best. First, it is always comparing apples and oranges (how exactly do you decide that a good hip-hop album is “better” than, say, a good chamber music album?). Second, no one listened to everything. Third, “best” most often has a suspiciously large overlap with “best-selling or most heavily promoted” in the mainstream outlets. Fourth, what was “best” of the decade is usually confined to a specific geographical and cultural space. Fifth, if you believe these lists, popular music albums are inevitably better than albums in any other genre. Sixth, and most importantly (:-)), the albums I like most never appear on the lists.
  • When I first thought about making a Spotify playlist of things I've (really) enjoyed this past year (Richmond Avant-Improv Collective's Multiplicity; the Anthology of Contemporary Music From the African Continent [and other new releases since on the Unexplained Sounds Group label]; and Walter Maioli & Nirodh Fortini's Taraxacum), my first choices weren't even available through them, so I kind of slumped at the chance, but there are some others that've made the grade since.

    I will say that Sampa the Great's new album getting #1 on the Bandcamp list is surprising and very gratifying!
  • I started a Spotify playlist of Chicago jazz-and-related stuff for the year, but a good half of the albums weren't there.

    As to the Bandcamp list, I am similarly stoked to see Damon Locks at #3

    As to album of the decade, my most played (aside from a couple of 'morning' albums) is Tyshawn Sorey's 'Alloy', so that's that. 

    [My methodology doesn't account for years the album was available to play, so 'Pillars' might be the real winner. I'm comfortable staking out a position where Sorey is artist of the decade...With consideration to Mary Halvorson, who earns bonus points for her relentless recording output]
  • I am sidling toward my own year-end list, with no illusions that anyone else will relate to more than bits of it.
  • I'm with you on all of that. @Doofy
  • The piece at The Quietus that @rostasi linked on the other thread has a good, eloquent debunking of bestness, and also a succinct statement of what I think is the usefulness criterion and raison d'être of these end-of-year lists: "if you spend some time with it you will come away having discovered new music that you love". For me the lists matter simply as another discovery engine, one with the merit of being built on some degree of someone else having loved something.
  • Case in point: I had totally missed that Teeth of the Sea had a new album out - will have to listen to that.
  • That story he tells about his dad in that Quietus intro is pretty amazing.
    ...but I'd have to say that I've known people like that.
  • Gp:
     The “best” claim always strikes me as silly at best
    Although everything you wrote was reasonable, I'm surprised you wrote all of that without recognizing the fundamental fact that "best <art>" is an incoherent statement. Art "quality" is a judgment, not a characteristic, and so even vastly different assessments are completely consistent and indicate nothing in particular about the subject of assessment (except perhaps for statistical associations with human perception).

    That said mostly to prescribe substituting "best" with "most highly judged" and not worry about it. (Or, if you worry, worry about teaching people to appreciate the above, ha.)
  • Well I think the apples and oranges comment was meant in that spirit. But the issue remains, “most highly judged” by whom and against what unstated criteria? (The issue is not the art having no characteristics that could be compared, but which ones are selected for relevance when ascribing quality, I think, based on, oh, 5 minutes thinking about philosophy of art :-).) The claim that such-and-such an album is the most highly judged is as vulnerable to the false-consensus effect as the claim that it is the best. Everyone, including the list compilers, is judging from somewhere.
  • I meant personal judgment using personal criteria (that of course one often develops and shares to varying degrees with other individuals). Writers could attempt to ignore their judgment and gauge consensus in a "best-of" list, but they typically don't, and meta-lists do that well enough. Historical discussions are a different bag.

    Of course, I acknowledge that consensus on "quality" often exists, but that reflects shared physiologies, cultures, and histories of the judgers, and not independent characteristics of the art itself. I like this piece as a reminder: https://www.wired.com/2009/09/monkeymusic/

    Even more interestingly (or insidiously?) misguided, I think, are the terms "underrated" and "overrated". It's fun to try to define those in useful ways that don't pretend that some things _should_ be seen as good or not.

  • My understanding is that you have people, at whichever organization, independently list the albums that made the most impact on them. Maybe using a weighted numbering system, the results of all of the employees are tabulated and a final list is compiled. It seems that the main benefit from this is that it tells you more about the organization’s interests and whether the organization is one that aligns with your interests. I’m sure we can all relate to the idea of not really caring about what a particular magazine or website finds interesting musically and so you just naturally gravitate towards some organizations. If you like a site that is concerned about having their finger on the pulse of popular society, then you go there - or you can go somewhere a bit obscure if your tastes lean that way.
  • Yeah, that may describe lists from "large" (multi-writer) entities like Pitchfork, but those are subject to hidden (unexplicated) editorial choices. Most "best" lists are single-author. I love meta-lists like NPR's jazz poll, where the collation criteria are objective: those do end up at least reflecting some consensus of individuals, which in part reflects the common criteria engaged listeners use to judge.
  • Yes, you can become enamored with a particular writer as well,
    but the writers are often tied to a publication or special website.
    Yeah, the NPR jazz poll last year was surprisingly good, so I'm
    looking forward to that this year. That "year-end" link I provided
    shows a somewhat typically weighted slant toward what they are
    all about - of course. The site that talks a lot about metal will
    provide their special list of the "best" albums - which just happen
    to have a hard edge to nearly all of them.

    With The Wire, I do enjoy the "full" list as well as reading about
    what individual writers think were the glowing releases of the year.
    After this last list from The Quietus, I'm considering following them
    a little more closely.

    I suppose, too, someone could go over to a site like Rate Your Music or
    Best Ever Albums 
    and find someone that seems to lean in your particular
    direction and see what they found interesting (that you may have missed).
  • I certainly gravitate to some (producers of) lists more than others. And I find myself pretty indifferent to what criteria of quality they may have applied - I basically end up scanning lists for clues that something I have not heard might be the kind of thing I might want to check out. That’s typically no more than a couple of things per list. Sometimes it’s extremely scientific things like cover art that catch my interest :-).
  • Great points, all.  I think the “Monkey Music” article really resonates on a, well, primal level.  However, I think it’s dead wrong to expect an open-minded person not to like music that wasn’t written for them or, more accurately, their particular demographic.  By now, we all know basically what kind of music we do and don’t like (while hopefully being open to exceptions), and if a list doesn’t have any of the former or mainly highlights stuff we think is “overrated” (which I do think is a very useful term, provided one knows who’s using it), we toss the list in the unwarranted hype bin.

    The Quietus folks are pretty deep thinkers/listeners all right.  One hundred albums for one year makes a good sample, and an even better one for excluding much pop nonsense (or sliding elsewhere-consensus top-10 favorites like Lana Del Rey and Billie Eilish into the mushy middle).  Maybe most striking and concerning, though, is the stark contrast with more mainstream sources:  about as high a percentage of "urban" music on Quietus as experimental stuff on Pitchfork or AV Club.

    I’ll readily dismiss any “best of” list not obviously compiled by some consensus-building process unless a single, personal guru (like one of you all) is behind it.  Groupthink has to be marginally better than a single dictator unless s/he’s your favorite one.

    Does a rating of “essential” or “non-essential” do any better?  

    Indeed for any popular music list, “best” just puts “bestselling/most streamed” or even more incisively “most broadly appealing” and likely “most heavily promoted” into a more marketable and authoritative box.  Since those distinctions map onto empirical data far better than “quality,” all a chart of “hits” is doing is gussying up sales numbers.  And who would want to read that unless repackaged, however falsely, in a guise of bestness?  

    I think the Pitchfork list and a few others do try to throw in some non-popular music (e.g. Fennesz at #8 for 2019), but those entries feel like tokens or deliberately ostentatious “look at me and my breadth” picks, based on unknown or usually unstated aesthetic principles that magically allow them to be rated relative to hip-hop and pop.

    I think “most highly judged” is useful to isolate the credibility of the source as the deciding factor, but it sure is awkward.  

    I’m going to stick to “Favorites” as both simple and honest.  And again, I do look forward to your lists; I’ve recognized little or nothing anyone has referenced so far.
