Is poor judging an issue in UIL?


If you ever take a look at UIL Area and even State spreadsheets, you'll often see a LOT of very ridiculous spreads. I know judges can see different things depending on where they sit and what they focus on, and I don't want to bash any judges, but this feels like a growing problem, with large spreads in ordinals becoming ever more common.

 

Attached are judges' sheets from prelims of 6A Area H, 5A Area E, 4A Area C, and 3A Area C, all from their most recent state years.

 

The 3A Area C marching judges just could not agree. That could come down to different judging philosophies, but the spreads got bad in places. 4A Area C had INSANE spreads, like 4-23-7-2-12 and 19-24-6-10-6.

 

The 5A Area E music judges were just all over the place; judge 2 ranked several bands that made finals really low in prelims. For 6A I could definitely find a better example, but it has its spreads too, albeit much more reasonable and common ones.
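If anyone wants to quantify this: a band's spread on a recap is just the gap between its best and worst ordinal across the panel. Here's a minimal sketch; the band names are made up, and the ordinals are the 4A Area C examples quoted above:

```python
# Per-judge ordinals (ranks) for two bands, taken from the 4A Area C
# examples above; the band names are hypothetical.
recap = {
    "Band A": [4, 23, 7, 2, 12],
    "Band B": [19, 24, 6, 10, 6],
}

def spread(ordinals):
    """Panel disagreement for one band: worst rank minus best rank."""
    return max(ordinals) - min(ordinals)

for band, ranks in recap.items():
    print(f"{band}: spread {spread(ranks)}")
```

On a panel that mostly agrees, spreads stay in the low single digits; here Band A's spread is 21 and Band B's is 18, which is the kind of number that should raise eyebrows.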

 

https://imgur.com/a/5rewfrf

 

If I had to guess why this is so common, I'd point to a lack of clarity on the judging sheets. Most categories are vague and left to the judge's interpretation, and the few ideas of GE and difficulty the sheets do present aren't written out very well. Take "General Appearance" and "Visual Reinforcement of Music": I would read those as the general look of the group at a glance and the drill fitting the music, but those ideas can be defined very differently by different judges.


Just in time for UIL Region and Area, as usual! :)

 

The short answer is yes. But as others will be quick to note, you can find ridiculous discrepancies in most other circuits if you look hard enough. BOA usually fares a little better, but part of that is because BOA only has two judges scoring the same thing on any one panel (Music General Effect). I do find it odd when individual performance scores are vastly different from ensemble performance scores, which happens pretty regularly in BOA.

 

How to fix it? Better training is probably the most obvious solution, but, if I'm letting my hair down, I think being more selective about who is allowed to judge is a better one. Unfortunately, to my knowledge, UIL does not appear to have a s***list. They should hit me up!

Do you think the more consistent scoring in BOA has anything to do with having a set of judges they rely on for their events, many of whom are also directors of BOA-competing bands?


I was recently looking back at the prelims results for San Antonio last year and noticed that a large part of the reason Keller jumped from 14th to 9th was their inconsistent GE scoring. In prelims they were ranked 23rd by one music GE judge and 4th by the other. They ended up 15th overall in GE and barely made finals, thanks to being 10th and 11th in music and visual.


Judging always seems to have a judge or two on a wildly different page from the others. I am not sure how to fix it, short of doing something like gymnastics, where programs start with a maximum score and are marked down for each issue noticed, with a defined standard for each deduction.

 

I personally think that would be a detriment to the programs: they would start to look very similar, and we would lose variety and innovation. It is frustrating when bands lose places in finals because of "that" judge. We have all seen it, and I think there will always be a judge who views the shows differently than the others. Heck, look at baseball umpires and their strike zones. They have a very controlled thing to watch for, and they still can't get balls and strikes right.
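For illustration, the gymnastics-style model described above amounts to a deduction system. A minimal sketch, with the max score and penalty size invented for the example (not any circuit's actual values):

```python
# Deduction-based ("tick") scoring sketch: every band starts from the
# same maximum, and each observed error subtracts a fixed penalty.
# MAX_SCORE and PENALTY_PER_TICK are made-up illustrative numbers.
MAX_SCORE = 100.0
PENALTY_PER_TICK = 0.1

def tick_score(ticks, max_score=MAX_SCORE, penalty=PENALTY_PER_TICK):
    """Score after deducting a fixed penalty per observed error."""
    return max(0.0, max_score - penalty * ticks)

print(tick_score(35))  # score for a run with 35 observed errors
```

It also shows why a system like this punishes risk: a harder show creates more chances to be ticked, so identical execution quality scores lower on more difficult material.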

DCI used to do that. It was called the "tick" system. I'm sure some old marching band circuits did it that way too, but BOA never has, as far as I know.

DCI used to do that. It was called the "tick" system. I'm sure some old marching band circuits did it that way too, but BOA never has, as far as I know.

The Cavalcade of Bands in the Mid-Atlantic states and the New York Field Band Conference used to use this format. The problem was that many band directors observed that if you tried something new or challenging, you were punished for things that were not in vogue during that period. The fact is that judging is a difficult thing, and across the country there are different styles of judging. In California, BOA is not as well thought of as it is here, because they have the SCSBOA, NCBA, MBOS, WBA, and several other circuits that each have their own judging standards and judges. Out there it is like a smorgasbord, and bands that win in one circuit are often judged lower in another. We can complain all we want, but each human has different likes and dislikes.



Where can one find the archived UIL spreadsheets? I have been looking for days with no luck. I am a number cruncher and find their scoring intriguing and erratic at the same time.


Do you think the more consistent scoring in BOA has anything to do with having a set of judges they rely on for their events, many of whom are also directors of BOA-competing bands?

 

First of all, I want to say that there are high quality people judging both BOA and UIL events. That said, BOA does seem to attract people whose pretty much only job in the marching arts is to judge bands, drum corps, and winter guards. Sure, you've got your Jay Webbs and Jarrett Lipmans, but a lot of these people are not band directors, or haven't been band directors in a long time. It's not uncommon for Dan Potter or Chuck Henson to announce a judge like: "He has been in the marching arts for 35 years. He has judged for BOA, DCI, and WGI world championships. He is the head judge coordinator for the New Zealand Winter Guard Association. He is an accountant. Judging visual performance - ensemble, from New Brunswick, New Jersey, Mr. Michael McFluff!" I think one of the good things about having career judges is that they judge A LOT, and they do so in circuits that are similar to BOA in terms of judging criteria, like DCI and WGI. So, there's a lot of reinforcement.

 

UIL judges, on the other hand, are typically Texas directors who don't do much judging. They'll put in their ten hours or whatever to judge a UIL area contest, and that's that. I'm generalizing, obviously, but this honestly describes a lot of them. I think the result of this is that personal criteria seep into the scoring a bit more than in BOA. How else do you explain results like SFA's UIL State result in 2004? SFA ended up 22nd place out of 31 bands, after one of the music judges had them in dead last. I think the judge who had SFA last probably placed a lot more emphasis on tone quality than the one music judge who had them in finals. SFA put on a very rhythmically accurate, passionate performance, but they also blew past the point of good tone quality, producing a pretty consistently strident sound. And that one UIL judge hit them HARD for it. Meanwhile, BOA music judges loved SFA all year. They ended up 4th place at BOA Grand Nationals and won the Music Performance caption. I didn't agree with that, but at least BOA was pretty consistent!



Interesting! But also a bit alarming, for UIL anyway. Also, do they pretty much use the same judges every year, or do they get new ones? I'm not sure which scenario would be better.


In our prelims run at Area, our music scores were 1, 5, and 10. The judge who gave us the 10 was the same judge who gave us low scores in 2014 and 2016, which resulted in us missing State. Just something to add to the table.

Consider yourselves fortunate: the same judge had CyFalls 18th in music while another judge had them 4th.
