
Thoughts on the DOL YouthBuild 2012 SGA: Quirks, Lessons, and, as Always, Changes

YouthBuild season recently ended, at least for those of us lucky enough to be writing the proposals and preparing the application packages.

1. I’ve warned against the “Perils of Perfectionism” for grant writers, but it appears that RFP writers have also heeded this advice—too well. Page 23 of the original YouthBuild RFP* says, “These attachments will not count against the 15-page limitation for the Technical Proposal.” Page 26 says, “The chart and staffing plan should be included as Technical Proposal Attachments and do not count against the 15-page limitation of the Technical Proposal.” Yet the RFP says, in many other places, that the page limit for previous YouthBuild grantees is 20 pages and for new grantees it’s 25 pages. I sent an e-mail to Kia Mason, the contact person, and she (or he?) said, “Those are errors, the page limitation for previous YouthBuild applicants is 20 pages.”

Sweet!

There was another change that made sense: the original RFP required that only county data be used in the needs assessment. A revision, however, allowed applicants to use city or other data. I imagine the DOL heard from a lot of organizations saying things like, “We’re in L.A. County” or “We’re in Harris County,” and pointing out that ten other applicants in the same county would be forced to use identical data. And L.A. County contains everything from Beverly Hills to Compton to the city of Los Angeles itself.

2. I must give credit where it’s due: instead of playing hide-the-salami with data, as so many RFPs do,** YouthBuild this year simply told applicants where to find data and had applicants report uninterpreted data from a single source. This makes a huge, shocking amount of sense. I also suspect that the DOL got tired of the weird hodgepodge of data that they probably get from most applicants.

3. As long as we’re talking about data, I can also surmise that the DOL is implicitly encouraging applicants to massage data. For example, existing applicants are scored on the reports they’ve previously submitted to the DOL, and they get points for hitting various kinds of targets. For the “Placement in Education or Employment” target, “Applicants with placement rates of 89.51% or higher will receive 8 points for this subsection,” and for “Retention in Education or Employment,” “Applicants with retention rates of 89.51% or higher will receive 8 points for this subsection.” Attaining these rates with a very difficult-to-reach population is, well, highly improbable.

That means a lot of previously funded applicants have also been . . . rather optimistic with their self-reported data. Still, those previously funded applicants haven’t necessarily been lying, per se. To understand why, let’s say that an organization is tracking a YouthBuild graduate and finds that she’s working at McDonald’s. But she also worked on her uncle’s deck for $30 last weekend. Is she employed? Is she employed in the construction industry? Or let’s say that a graduate reports that he’s enrolled in a community college. Do you call the community college and get the graduate to release his records, or do you take him at his word? Do you subtly encourage him to tell you he’s in school?

The cumulative weight of these micro decisions can have an enormous impact on the numbers that get submitted to the DOL. Some organizations are no doubt more diligent than others. We would never tell organizations to falsify data. But we do point out that not everyone interprets data claims the same way. The DOL implicitly rewards one kind of interpretation. Everyone knows there’s gambling at Rick’s in Casablanca. The official position is not always the right one, and it’s worth reading between the lines.
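
To make the arithmetic concrete, here’s a minimal sketch in Python. The 89.51% threshold comes from the SGA’s scoring rubric, but the cohort size and the count of ambiguous cases are invented for illustration:

# Hypothetical illustration: how classification judgment calls can move a
# placement rate across the DOL's 89.51% threshold for full points.
THRESHOLD = 89.51  # from the SGA's scoring rubric

def placement_rate(placed, enrolled):
    """Return the placement rate as a percentage."""
    return 100.0 * placed / enrolled

enrolled = 60        # hypothetical cohort size
clearly_placed = 52  # graduates with unambiguous jobs or enrollment
ambiguous = 4        # e.g., odd jobs for an uncle, unverified college enrollment

conservative = placement_rate(clearly_placed, enrolled)
optimistic = placement_rate(clearly_placed + ambiguous, enrolled)

print(f"Conservative count: {conservative:.2f}%  full points: {conservative >= THRESHOLD}")
print(f"Optimistic count:   {optimistic:.2f}%  full points: {optimistic >= THRESHOLD}")

In this made-up cohort, four judgment calls are the difference between clearing the 8-point threshold and missing it.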

If you’re funded this year, you may want to remember this section when you’re filing your reports next year.

4. The RFP is structured in a strange way: the “Program Design” section asks applicants to describe the training they’ll provide before describing the outreach, recruitment, and selection process. It would make more sense to structure the RFP in the order participants will actually move through the program. Perhaps this is also symptomatic of the problems whoever wrote this RFP ran into while chopping up last year’s RFP to make this year’s.

5. The existence of YouthBuild is a testament to the power of zombie programs;*** graphs like the one in this post have proliferated and demonstrate that construction employment is not only down but so far down that it’s back at 1994 levels. This may be why the DOL will now let previously funded applicants offer alternative career paths. Still, training people for the construction industry right now doesn’t make a lot of sense, even by federal standards.

We also have pretty severe housing imbalances—there are too many housing units in places like Phoenix, Las Vegas, and the Inland Empire, and too few in places like Manhattan, Seattle, and San Francisco. The problem with the latter municipalities isn’t a matter of construction workers—it’s mostly a problem of municipal regulation, especially regarding height, density, and parking requirements. For more on this, see Edward Glaeser’s Triumph of the City, Matt Yglesias’s The Rent Is Too Damn High, Ryan Avent’s The Gated City, and Tom Vanderbilt’s Traffic. None of them will particularly help you write a YouthBuild proposal, but they will help you understand what’s going on.

6. Don’t be afraid of tautologies. You were warned against tautologies by your logic and writing teachers for a good reason, but you should disregard those warnings for a program like YouthBuild. There were a depressing number of questions like this one: “The applicant has an effective strategy to integrate all program elements, including the integration of community service and leadership activities supporting career exploration and occupational skill training.” The obvious answer (the program elements will be integrated by providing them together, rather than “in pieces”) is basically another way of saying, “Program elements will be integrated by being integrated.” Again: this doesn’t make a lot of sense, even by federal standards. The proposal can only be 20 or 25 pages, which doesn’t leave a lot of room for the repetition the DOL implicitly wants.

7. In keeping with the above, as usual, it was impossible to fully answer all the questions in 20 or 25 pages.

8. Page three of the RFP says: “Cost-Per-Participant: Cost-per-participant must fall in the range of $15,000 – 18,000 and the applicant must indicate the projected enrollment per year. The cost per participant should take into consideration the projected enrollment, leveraged funds and other resources supporting the program” (emphasis added). I wrote this to Kia Mason:

What does “take into consideration” mean in this context? Does that mean that YouthBuild wants a cost-per-participant that counts the entire match? For example, if an applicant requests $1,100,000 and gets the mandatory 25% of $275,000, the project total will be $1,375,000. Dividing that amount by $18,000 yields about 76, while dividing $1,100,000 by $18,000 yields 61. The SGA doesn’t offer any examples or further guidance about what this means.

He or she replied: “The cost per participant is derived from the federal amount requested only.” I imagine Kia got a ton of questions on this issue, since the phrase “should take into consideration” is so vague. I also imagine that a fair number of applicants didn’t inquire into the meaning of the phrase, and, consequently, the DOL will get half its proposals calculated under one assumption and half under the other.
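
For anyone who wants to check the arithmetic, here’s a minimal sketch in Python of the two readings, using the hypothetical figures from my e-mail; per Kia’s reply, the federal-only calculation is the one the DOL actually wants:

# Two readings of "cost per participant," using the hypothetical request above.
federal_request = 1_100_000
match = 0.25 * federal_request   # mandatory 25% match = $275,000
cost_per_participant = 18_000    # top of the SGA's allowed range

# Reading 1: federal request plus match (what "take into consideration" seems to suggest)
participants_with_match = (federal_request + match) / cost_per_participant
# Reading 2: federal request only (the interpretation the DOL confirmed)
participants_federal_only = federal_request / cost_per_participant

print(f"Federal request + match: about {participants_with_match:.0f} participants")  # ~76
print(f"Federal request only:    about {participants_federal_only:.0f} participants")  # ~61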

9. There’s a particularly inane question under Section 1.d., Factor Five: “The benefit of the participation of youth in occupational skills training within the selected industry(ies) that will be derived to the community.” The major “benefits” the community might derive are at best nebulous, and they’re about the same for all communities: having people in jobs instead of jails, creating nominal taxpayers, providing nominal low-income housing, and so forth. These benefits don’t change much from California to Connecticut.


* The Department of Labor prefers the term “SGA,” or “Solicitation for Grant Applications,” rather than RFP; we generally use RFP on this blog, rather than further confusing matters by applying the alphabet soup of acronyms that various federal and non-federal agencies use to describe the various ways they emit documents that will ultimately lead to the distribution of money.

My favorite recent example of acronym fever comes from the ACF’s Transitional Living Program and Maternity Group Homes: “The Family and Youth Services Bureau (FYSB) is accepting applications for the Transitional Living Program (TLP) and for Maternity Group Homes (MGH) funding opportunity announcement (FOA). TLPs provide an alternative to involving RHY in the law enforcement, child welfare, mental health, and juvenile justice systems.” I wonder if ACF also wants BBQ ASAP.

** See “RFP Lunacy and Answering Repetitive or Impossible Questions” for still more discussion on this issue.

*** Our post “Déjà vu All Over Again—Vacant Houses and What Not to Do About Them” also discusses elements of housing policy and how governments respond to housing issues.


A Day in the Life of a Participant Is Overrated: Focus on Data in the Neighborhood

I’ve seen a lot of proposals from clients and amateur grant writers that include something like “A day in the life of Anthony” in their needs assessments. This is almost always a mistake: almost anyone can include a hard-knocks anecdote, and such anecdotes convey virtually no information about why your hard-knock area is different from Joe’s hard-knock area down the street, or on the other side of the tracks, or across the country. These stories are staples of newspaper accounts of hardship, but newspapers know most of their readers aren’t thinking critically about what they read and aren’t reading 100 similar stories over eight hours. Grant reviewers do both.

Off the top of my head, I can’t think of any RFPs that requested day-in-the-life stories in the needs assessment. If funders wanted such stories, they’d ask for them. Since they don’t, or at best very rarely do, you should keep your needs assessment to business. And if you’re curious about how to get started, read “Writing Needs Assessments: How to Make It Seem Like the End of the World.” If you’re applying to any grant program, your application is one of many that could be funded, so you want to focus on the core purpose of the program you want to run. Giving a day in Anthony’s life isn’t going to accomplish this purpose.

Creativity is useful in many fields, but those involving government are seldom among them (as we wrote in “Never Think Outside the Box: Grant Writing is About Following the Recipe, not Creativity”). As a result, unless you see specific instructions to do otherwise, you should stick to something closer to the Project Nutria model, in which you describe the who, what, where, when, why, and how. Anything extraneous to answering those questions wastes pages and, perhaps more importantly, wastes patience and the precious attention that patience requires.

There’s only one plausible exception I can think of: sometimes writing about a day in the life of a person receiving project services can be helpful, but again, you should probably leave those stories out unless the RFP specifically requests them. Some RFPs want a sample daily or weekly schedule, and that should suffice without a heroic story about Anthony overcoming life obstacles when he finally receives the wraparound supportive services he’s always wanted.


Making It Easy to Understand Who’s Eligible for HRSA’s School-Based Health Center Capital Program

The Affordable Care Act has made it especially hard to figure out who’s eligible for a program. This week, the “Affordable Care Act: Grants for School-Based Health Center Capital Program” is our star. The announcement says that eligible applicants must “Be a school-based health center or a sponsoring facility of a school-based health center as defined in 4101(a)(6) of the Affordable Care Act [. . .]”

Once you track that section down, however, you find this: “(6) DEFINITIONS- In this subsection, the terms ‘school-based health center’ and ‘sponsoring facility’ have the meanings given those terms in section 2110(c)(9) of the Social Security Act (42 U.S.C. 1397jj(c)(9)).” So it’s off to find the Social Security Act. Eventually you’ll find:

In general.—The term “school-based health center” means a health clinic that—

(i) is located in or near a school facility of a school district or board or of an Indian tribe or tribal organization;

(ii) is organized through school, community, and health provider relationships;

(iii) is administered by a sponsoring facility;

(iv) provides through health professionals primary health services to children in accordance with State and local law, including laws relating to licensure and certification; and

(v) satisfies such other requirements as a State may establish for the operation of such a clinic.

Note that these five clauses describe only what “school-based health center” means “In general.” What would a specific definition be? Notice too that the act doesn’t say whether this program applies only to K-12 schools or whether universities count. We’re choosing to interpret Local Education Agencies (LEAs) and their subsidiaries as eligible, since Institutions of Higher Education (IHEs) aren’t mentioned and “school district” generally implies LEAs. If you have any reason to think otherwise, let us know in the comments.

This isn’t the first time we’ve investigated curious, obfuscatory eligibility requirements, and I doubt it will be the last.


Why Does the Seliger Funding Report Sometimes Lack Key Data? Examples from the Small, Rural School Achievement Program and the Portable Assistance Program

Readers occasionally ask why our e-mail newsletter lacks key data about funding, like the maximum size of a grant or the amount of money available. We always have the same answer: we present whatever information we can find from the funding source. When we write “N.A.,” it’s not because we’re trying to hide data—it’s because we don’t have it.

Take, for example, the Small, Rural School Achievement Program, which you might have seen in last week’s newsletter. The Federal Register notice offers almost no information. The Grants.gov synopsis is little better. I read both, trying to find answers.

I didn’t find any, but I did learn that the Small, Rural School Achievement Program is offering formula grants, so the Department of Education may simply divide up the pie on a per-capita basis, or use some similar scheme. But they don’t tell you, so that’s just a reasonable guess on my part.

Other times, we get weird data and simply report what we find. Take the “Portable Assistance Program,” which offers funds to “provide services and/or develop small business assistance products that are centered on a replicable plan of action to increase small business success and viability in communities suffering economic hardship.” Aside from that description having a lot of words without much content (what does a “replicable plan of action” mean? McDonald’s has one of those too, I think, and so do many science fiction monsters), the program has $1,080,542 available to make 11 grants of at most $100,000. If you can do elementary math, you’re probably wondering why they didn’t shoot for 10 grants or try to get a larger budget allocation, since 11 grants at the maximum award would require $1,100,000. I don’t know. Someone in the bowels of the Small Business Administration might, but that person isn’t me.
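
If you’d like the mismatch spelled out, here’s a trivial sketch in Python using the figures reported above:

# The Portable Assistance Program's own numbers don't quite add up.
available = 1_080_542  # total funding available
grants = 11            # planned number of awards
max_award = 100_000    # stated ceiling per award

print(f"{grants * max_award:,}")     # 1,100,000 -- more than the money available
print(f"{available / grants:,.2f}")  # 98,231.09 -- the most 11 equal awards could each be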