
“Currently, [Census] data is not loading properly:” DOL’s YouthBuild FY ’21

Needs assessment experts and data nerds know that factfinder.census.gov, the old primary portal into Census data, is dead, while the new census data portal, data.census.gov, is only somewhat alive. Last year, I started a post about the ways that data.census.gov is broken, but I abandoned it because it was too boring, even for me; at the time, data.census.gov was hellaciously slow, often taking 10 seconds per query (a needs assessment may require dozens or hundreds of queries), and many internal links simply didn’t work. Some of that seems to have been fixed: back then, for example, trying to find specific sub-data sets, like educational attainment for a given zip code, didn’t work. I sent some feedback to the Census contact person, who was very helpful, and eventually most of the problems disappeared.

But not all, it seems; this year’s DOL YouthBuild NOFA includes a humorous instruction regarding data requirements: pages 84–86 offer a 20-step algorithm for acquiring poverty data. That the algorithm has 20 steps and three pages is obviously bizarre: instruction 17 notes, “A table will come up showing the Total Population, the Number in Poverty, and the Poverty Rate. Currently, the data is not loading properly and at first only the overall U.S. data will load and you will not be able to scroll any further to the right to see anything else.” Oh? “Currently, the data is not loading properly”: that seems as if it could be the theme of the new Census interface.
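For what it’s worth, the same numbers can often be pulled without touching the web interface at all. Below is a minimal sketch, assuming the Census Bureau’s ACS 5-year API and its detailed poverty table B17001; the vintage, variable codes, and geography are illustrative and worth double-checking against api.census.gov before you rely on them.

```python
# Minimal sketch: pull total population, number in poverty, and poverty rate
# for one county from the ACS 5-year API, instead of clicking through
# data.census.gov. Table B17001 and the 2019 vintage are assumptions; verify
# the variable codes at api.census.gov before using them in a proposal.
import requests

ACS_URL = "https://api.census.gov/data/2019/acs/acs5"

def county_poverty(state_fips: str, county_fips: str) -> dict:
    params = {
        "get": "NAME,B17001_001E,B17001_002E",  # universe and below-poverty counts
        "for": f"county:{county_fips}",
        "in": f"state:{state_fips}",
    }
    rows = requests.get(ACS_URL, params=params, timeout=30).json()
    record = dict(zip(rows[0], rows[1]))  # first row is the header
    total = int(record["B17001_001E"])
    poor = int(record["B17001_002E"])
    return {
        "name": record["NAME"],
        "total_population": total,
        "in_poverty": poor,
        "poverty_rate_pct": round(100 * poor / total, 1),
    }

if __name__ == "__main__":
    print(county_poverty("17", "031"))  # e.g., Cook County, Illinois
```

A few dozen queries like this run in seconds, which is rather less painful than a three-page, 20-step walkthrough.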

About 10 years ago, there was a popular link-sharing site called Digg; it introduced a now-notorious redesign that users hated, and those users abandoned the site en masse, leading to the rise of Reddit, a now-popular link-sharing site. If Digg had been more careful, it probably would have maintained its previous site design for those who wanted it, while introducing its new site design as a default, but not mandatory, experience. And then Digg would likely have iterated on the new design, figuring out what works. Reddit has somewhat learned this lesson: it now has two interfaces, one primarily living at old.reddit.com, which is maintained for people highly familiar with “the old Reddit,” and a newer one that is available by default at reddit.com. This bifurcation strategy allows a smooth transition between interfaces. The Census didn’t follow this strategy, and instead killed the old interface before the new one was really ready. Thus bugs: the ones I’ve noticed, and the ones the Dept. of Labor noticed and mentioned specifically in the YouthBuild NOFA. The more general lesson is fairly clear: be wary of big user interface changes. If you need Census data, though, you’ll have to use the interface as is, since it’s the only one available.

For some reason—perhaps latent masochism?—Isaac continues to use MS Office 365 Outlook (not the free version) as an email client, instead of Apple’s Mail.app or Thunderbird, and he tells me that every time he opens Outlook, he gets an invitation to try “the new Outlook” interface. So far, he’s resisted, but he also points out that most change is positive: when S + A started in 1993, there was effectively no commercial Internet, and the only way to get Census data was to go to a Census office (if you were near a big enough city), city hall, or a large library, where it was possible to thumb through the impenetrable Census books and maps. After a year or two in business, some vendor got the idea of putting the 1990 Census data on CDs (remember those?) for quite a high price. Even though S + A was struggling to control costs, Isaac bought the CDs, since they were better than hours in a Census office or library. But then he had to buy, and install, CD drives in the Pentium PCs (remember those?) we used. A couple of years later, he stumbled into a Census data portal set up by a random university, which worked! So he tossed the CDs. When the 2000 Census came out, the feds essentially copied the university’s interface, creating factfinder.census.gov, and all was well until data.census.gov came along. It’ll probably be better than the old interface, at some point.

Complaining is easy and making things better is hard. In the Internet era, both complainers and makers have been empowered, and I appreciate the difference between the two. People who have fundamental responsibility for a product, service, or organization, including the responsibility for making hard decisions that aren’t going to be popular with everyone, have a different perspective than those who can just complain and move on. So I don’t want to be a drive-by complainer, as so many are on “social” media, which seems poisonous to institutional formation and coherence. But, despite those caveats, the instruction from DOL regarding the Census being broken is perversely funny.


Writing Needs Assessments: How to Make It Seem Like the End of the World

Almost every grant proposal requires some form of needs assessment. More or less, the sentiment one must get across is that “It’s the end of the world as we know it and I feel fine,” as R.E.M. sings. Essentially, the object is to make problems look overwhelming, but solvable with just a dollop of grant funds. So, how does a grant writer do this?

Start by making the end appear nigh, which requires a needs assessment. Look at the Census data available at American FactFinder, which has a variety of geographic choices (e.g., county, city, zip code, or census tract). It is almost always best to match the project target area with a Census geographic area to make assembling data easier, regardless of whether the Census area perfectly matches the area you want to serve. Try not to make the target area something like “the Westside of Dubuque” unless that happens to conform to, say, four census tracts. Most geographic areas have 2000 Census data, as well as estimates for 2005. Pick the year that is to your advantage, and “to your advantage” means making the situation look worse. For example, if incomes have been trending downward and unemployment upward due to plant closings, the 2005 data may be better. Announce that, if current trends continue, Dubuque may be abandoned completely by 2010 because there are too few jobs, but the situation can be improved with the requested grant.

Once you have your target area, find useful socioeconomic indicators like ethnic breakdown, median family income, age cohort percentages, percent of people living below poverty, percent with disabilities, etc. Only include data that supports your case. A winning grant proposal is not like a thesis, so you are under no obligation to use all available data. Also, it is critical that you provide some data on a larger area for comparison purposes, so your readers understand the relative problems. This can be city, county, state, or even national data—pick whichever makes your situation look worst, meaning the one with the greatest discrepancy between the target area and the larger area. It doesn’t really matter which geographic level you compare to, as long as you can say something to the effect of, “The target area median family income is just 2/3 that of Los Angeles County.” Depending on the target population, it may be advantageous to compare data for a particular ethnic group to all residents. For example, if the target area includes a significant African American population with lower incomes, you can set up tables showing African American indicators versus white indicators for the same geographic area, in essence comparing the target area to itself. American FactFinder has a handy tool on the left button bar for “Fact Sheet for a Race, Ethnic or Ancestry Group” that makes this easy to do.

[Screenshot: the American FactFinder “Fact Sheet for a Race, Ethnic or Ancestry Group” tool.]
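To make the “pick whichever comparison makes your situation look worst” idea concrete, here is a toy sketch with invented numbers; it simply checks which candidate comparison geography produces the biggest gap against the target area’s figure.

```python
# Toy sketch of choosing the comparison geography that maximizes the gap.
# All figures are invented for illustration; substitute real Census values.
target_mfi = 38_000  # hypothetical target-area median family income

comparisons = {
    "City of Dubuque": 52_000,
    "Dubuque County": 58_000,
    "State of Iowa": 61_000,
    "United States": 65_000,
}

# The most dramatic comparison is the one where the target area's figure
# is the smallest fraction of the comparison figure.
best = min(comparisons, key=lambda geo: target_mfi / comparisons[geo])
ratio = target_mfi / comparisons[best]
print(f"Target area median family income is just {ratio:.0%} of the {best} figure.")
```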

You can also use census data to obfuscate the actual reality in the target area. For example, in many Southern California cities there are high percentages of Asian Americans, who in some communities have higher-than-average incomes. This can be used for statements such as, “more than two in five residents are persons of color.” For better or worse, most grant reviewers will associate persons of color with lower incomes and higher risk factors, whether this is true or not. Grant reviewers seldom have a deep background in statistics; they probably don’t even know statistics for journalists, let alone real statistics. Even if they do, everything starts to become a haze after reading a dozen onerously long federal proposals, so most reviewers are apt to begin looking more for conclusions than data not long into the process. Do you somehow fulfill the checkbox that asks whether educational attainment is lower in the target area than in the nation? If so, give ’em five points and move on.

Other good sources of data include state and local departments of education. Some states and school districts have better data engines than others. For example, the California Department of Education has a great site, DataQuest, but other states’ data systems are, as Borat would say, “not so much.” If a good data engine/warehouse is not available, find the school/district report cards mandated by the federal “No Child Left Behind” legislation. Many districts try to hide these reports, as they are often unflattering once you get past the mission/vision statement platitudes, but if you dig hard enough you will find them. If necessary, call the statistics unit at the district or state and force the reports out of them.

Once you have data, only use what helps the argument. So, if test scores for certain grades are low relative to the county or state, use those, not all test scores. If you want to use dropout data, use the four-year derived rate, not the single-year rate, which will be much lower. In some states, such as Illinois, dropout data is wildly understated, due to the way the state treats students who are no longer in school, so if you have to use it, underscore this fact. Health data, including disease incidence, mortality, etc., can usually be found at state and local health department web sites, while crime and gang data are typically found at police department web sites.
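On the dropout point, a little arithmetic shows why the four-year derived rate is so much scarier than any single-year rate. The sketch below uses the common derivation (one minus the product of each grade’s “stayed in school” probability) with invented rates; your state’s exact formula may differ, so check before quoting it.

```python
# Sketch of why a four-year derived dropout rate dwarfs single-year rates.
# The annual rates are invented; the derivation is the common approach
# (1 minus the product of each grade's completion probability).
annual_dropout_rates = {"grade 9": 0.04, "grade 10": 0.05, "grade 11": 0.06, "grade 12": 0.05}

stayed = 1.0
for rate in annual_dropout_rates.values():
    stayed *= (1 - rate)          # probability a student survives each grade

four_year_rate = 1 - stayed       # probability of dropping out at some point
print(f"Single-year rates of 4-6% imply a four-year derived rate of {four_year_rate:.1%}")
```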

If you’re having difficulty building your argument with data, a good technique is to call local “experts” for quotes. For example, find and call the police unit responsible for gang suppression in your target area, then ask leading questions. Invariably, the officer will tell horror stories about rampant gang activity. Just ask if you can quote her and she will almost always agree. It’s always fun to include the names of some local gangs in your proposal for a dash of reader titillation. This is particularly important if the reader is on proposal 35 out of 40 and just wants to go find the hotel bar. You can also find the name of any large social service provider or city official in the target area (other than the one for whom you are working) and ask them about local problems with the target population. For example, if you seek information about at-risk youth services and you talk to the local Boys and Girls Club executive director or city parks director, this person will almost always say that new problems are erupting every minute while their funding is declining.

This gives you the opportunity to write something like this: according to Conrad Cuttlebone, YMCA Director, “There are many more latch-key kids in the community since the Hindenburg Dirigible Factory closed, and we’re seeing many more cases of domestic violence, while at the same time the county cut our funding by 50%.” When all else fails, you can simply write, “although specific target-area-level data is not available, the agency knows anecdotally that teen pregnancy is on the rise, mirroring national trends.” Of course, you can do this even if the local area doesn’t match national trends, as most reviewers don’t have the vaguest idea about national trends for anything.

In other words, while it is not a good idea to make up data, it’s perfectly fair to exaggerate problems through obfuscation and specious analysis. You’re generally rewarded for such efforts: the worse the target area looks, the more likely you are to get points, and the more likely you are to be funded.

The gentle art of writing needs assessments really comes down to painting word pictures that combine cherry-picked data with opinions and anecdotes strung together to meet the expectations of reviewers, who assume something terrible must be going on in your community, or you would be doing almost anything other than writing a grant proposal, such as watching my favorite college football team, the KU Jayhawks, trounce Virginia Tech in the upcoming Orange Bowl. Rock! Chalk! Jayhawk! KU!