Thursday, December 8, 2022

12 Questions that Police Leaders Should Ask About Their Crime Analysis Capabilities

Next week, I have the honor of attending the National Public Safety Partnership conference in Tulsa, Oklahoma, to work with a number of agencies experiencing high violent crime rates in their communities. I was asked by the Bureau of Justice Assistance and the Institute for Intergovernmental Research, the organizers of the conference, to revive a presentation that I had given a few years ago. Fellow consultant Julie Wartell has joined me for the session and helped refresh and supplement the original material with her advice and extensive experience.
   
Given that each agency attending the symposium has unique issues and Julie and I only have an hour to meet with each of them, I thought I'd offer a full version of the presentation here in case we don't have time to get to all the material. I also thought it might benefit agencies unable to attend the event.
      
     
If you don't feel like watching the video to see what the questions are, here's the list with a few notes:
  
  1. Is my crime analysis comprehensive? Your crime analysis unit should issue regular products on people, places, patterns, and problems.
  2. Do we have enough people? The IACA recommends 1 analyst per 1,500 UCR Part 1 crimes, 1 per 2,800 NIBRS Group A crimes, or 1 per 70 sworn officers. These calculations should all come out to roughly the same figure.
  3. Are they located in the right place? We recommend an organizational location at a level before the agency splits into its major divisions and bureaus, and a physical location in a central, accessible place to help maximize information-sharing and rapport.
  4. Are they allocated in the best way? In agencies with multiple analysts, do you separate them by geography, crime type, analysis type, or some other factor? There's no universal answer, but the best agencies achieve a balance between centralization and decentralization.
  5. Do they have access to the data they need? You know what? I'm just going to link to this other article.
  6. Do analysts have a rapport with the officers? Civilian analysts need to develop relationships with line personnel to facilitate the exchange of information and the use of analytical products. Various actions by leadership can help or hinder such rapport.
  7. Are the analysts' products actionable? Good crime analysis products lead directly to solutions. To do that, they can't just be a bunch of statistics, charts, and maps. They have to balance the qualitative and quantitative, consider both police and non-police data sources, and answer the questions of who, what, when, where, how, so what, and now what.
  8. Are the analysts' products used? If you expect actionable products from your analysts, you should expect action from the rest of the department. Good agencies have policies and programs to address repeat offenders, hot spots, short-term patterns, and long-term problems using a variety of mechanisms.
  9. Do analysts have a professional development plan? Crime analysis is constantly changing. Analysts need to keep up on the latest tools, technologies, ideas, and best practices, as well as the changing nature of crime itself. Good crime analysis units devote time every week to professional development.
  10. Do analysts participate in professional networks? Analysts learn from each other. For very little money, crime analysts can become a part of local and national professional associations that not only provide training and literature but let them share what they know with a broader community.
  11. Do we provide anything to partners and stakeholders? Don't forget the community. Most analytical products could, with only minor redactions, be sent out to the public. Help your residential and business community members prevent their own victimization with detailed, current products, not just the same old "crime prevention tips."
  12. Would crime analysts' outputs hold up to scrutiny? Make sure your analysts are following conventional processes for accessing and querying data and that their numbers and findings are sensible. Some analysts simply don't have the right training, resources, or, unfortunately, skill or interest. 
    
The signs of a good crime analysis capability include:

  • The agency receives regular analytical products on people, places, patterns, and problems.
  • The analysis unit has enough people to provide in-depth, accurate, actionable analysis of these issues.
  • The analysis unit is organizationally located in a place that allows it to serve the needs of the entire agency.
  • If there are multiple analysts, their workload is allocated in such a way that no major crime types, geographic areas, or operational divisions are going unserved.
  • Analysts have ready access to a timely, complete, and accurate dataset on which they can conduct flexible queries, with no unnecessary technological or procedural obstacles.
  • Analysts have strong rapport with the operational personnel around them. Their work is trusted and used, and they receive regular intelligence and feedback.
  • Everything issued by the crime analysis unit is detailed, accurate, and precise enough to inform direct action.
  • Those products are actually used by operational divisions to create targeted tactics and strategies.
  • Analysts receive regular training and professional development and have a plan for such.
  • Analysts participate in global and regional associations and both follow and contribute to global standards.
    
I'd be glad to discuss any of these issues in more detail. 

Monday, September 19, 2022

Some Stories Have Happy Endings

A 1970s known offender card from the Cambridge (MA) Police Department
     
When I began my career as a crime analyst at the Cambridge Police Department in the early 1990s, it was just before the global transition to widespread desktop computing. The department, and the Crime Analysis Unit, was still awash in paper reports and records, some quite innovative. The above "known offender" card is an example. There were thousands of these cards lined up in a long drawer, each with information about a known robber, burglar, sexual assailant, or other repeat offender.
   
The edge of the card holds a series of variables about each offender--race, sex, age range, type of crime, preferred victim, and so forth. (Races are given as White, Black, Oriental, and "P.R." Apparently, there was no "Latino" or "Hispanic" in the 1970s, just Puerto Rican.) Each attribute has a corresponding hole near the edge, and for each attribute that applied to your offender, you would punch out the paper between the hole and the edge of the card. Then, if you had a series of burglaries involving a "Puerto Rican" male in his 20s and you wanted to find all offenders who matched that description, you would stick a long metal skewer through the "burglar" hole and lift up the stack of cards. All the burglars would fall out. You'd repeat the exercise for males, for Puerto Ricans, and for the age range until the only cards left were those that matched your filter. It was a physical version of the WHERE clause in SQL.
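
For the database-curious, here's roughly what that drawer full of cards was doing, expressed as a modern query. This is just a sketch; the table and field names are invented for illustration.

    -- A rough modern equivalent of skewering the card drawer.
    -- Table and field names are invented for illustration.
    SELECT *
    FROM known_offenders
    WHERE crime_type = 'BURGLARY'
      AND sex = 'M'
      AND race = 'P.R.'        -- the card's 1970s category, as noted above
      AND age_range = '20-29';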
    
We didn't actually use this system by the time I was working there. Most of the cards date from the 1970s. But I thought the system was so clever that I took a scan of one of the cards so that I could use it as an example of a pre-computer database. I've been using the image above in presentations for over 20 years. This particular known offender card was created in 1974 and concerns a man named Joseph, nicknamed "Jo Jo," then in his 20s. Jo Jo was a heroin addict who liked to break into apartments in the nighttime from fire escapes. He also had at least one attempted murder charge.
   
This afternoon, I spoke with Jo Jo.
   
It somehow took me 20 years of using his card before it occurred to me to Google his name and see what became of him. Google told me part of the story and supplied his phone number, and he told me the rest of it. After he hit rock bottom in Cambridge in the 1970s, he turned his life around, moved out of the Boston area, got into rehab, got a graduate degree, and became a substance abuse counselor. He worked in the field for decades before retiring in the 2010s. He is now in his 70s, and he was extremely amused hearing someone call him "Jo Jo" for the first time in nearly 50 years. 
  
We made plans for him to speak to my CJ 101 class about crime and drugs. 
  
As a crime analyst and quasi-criminologist, I know that Jo Jo's story isn't unique. Lots of criminals age out of crime. I was as likely to find a positive story as I was a 1980s obituary. But sometimes it doesn't feel that way. It was a nice way to start the week.

Thursday, September 15, 2022

For @#$&'s Sake, Just Give Me the Data

 
With a full set of RMS data tables linked to my SQL-based querying application, there's no question I can't answer. Without it, there are few questions that I can answer.
       
Imagine being hired as a head chef for a new restaurant. You show up for your first day of work and find the kitchen full of pots and pans but no ingredients. When you ask the facilities manager where the food is, he tells you that it's all locked in a separate room, but if you tell him what you need, one ingredient at a time, he'll bring each one out to you when he gets around to it. Or imagine you're hired to play the piano in a hotel lobby, only to find when you arrive for your first shift that you're not actually allowed to touch the piano's 88 keys. Instead, you have to push six buttons, each of which presses a predetermined combination of keys. With this, you somehow have to create music and respond to visitors' requests for songs.
   
Crime analysts everywhere face the equivalent of these scenarios daily, and they have for about as long as I can remember. I have consulted with police agencies for over 20 years, and by far the top issue that I have encountered when evaluating crime analysis units is the simple inability of analysts to access the agency's data at an appropriate level. Everywhere I go, I find analysts fighting with their IT staffs and records management system (RMS) vendors, making do with half-measures, and hand-entering data that already exists in databases into their own separate systems because they can't get at the official data. These analysts often find no support from their agencies' executives, who simply accept pathetic excuses from IT directors and RMS representatives for why what the analyst wants cannot be done. The advent of "cloud-based" systems has added a new level of difficulty: the nature of such systems precludes some of the traditional mechanisms for connecting to data, and many cloud-based vendors haven't bothered to create alternative solutions.

This situation is absurd. It needs to stop. 

Let me be clear: For crime analysts to do their jobs, they need direct, timely access to the agency's call-for-service (CAD) and crime (RMS) data in its original relational format. The only asterisk I would put after this statement is that if the CAD or RMS needlessly stores data as a bunch of indecipherable codes, then analysts could instead be given access to replications of those original tables with the codes translated. (But if such views don't exist, just give analysts the raw data. They'll figure it out.) Otherwise, any attempt by the IT staff or RMS vendor to provide special reporting tools, views, or custom reports that only access some of the data, and only in certain formats, will retard rather than enhance analysts' abilities to do their jobs.
           
ODBC has been around for 30 years. Stop saying you don't support it.
      
Any of the following solutions are acceptable:

  • A direct connection to the RMS and CAD databases. 
  • A direct connection to a replication of the RMS and CAD databases, updated at least once a day, with all the substantive data tables.
  • An export of all substantive data tables to a series of delimited or fixed-width text files, updated at least once a day.
    
Solutions that are not acceptable include anything that exists wholly within the RMS application. I need to be able to write complex queries across tables, synthesize data from multiple sources, and extract data to a variety of technologies. No RMS, no matter how advanced, will ever do everything I need from within the system.

The number one question that any agency with a crime analyst needs to ask of its current or potential RMS vendor is: which of these three solutions do you support? If the answer is "none," leave them sputtering about their special customized crime analysis dashboard and walk away. 

Frankly, this should be all I have to say. It should be all any analyst has to say. Analysts are presumably hired for their expertise, and nothing is more annoying than being told, "No, you don't really need that" by people who don't have to do this work. But since that approach clearly doesn't work, I'll spend the rest of this article explaining why we analysts know what we're talking about.
   
I need to be able to answer complex questions.
    
It's great that your records management system's reporting tools let me count how many crimes we had last quarter, or find a list of all incidents involving a particular person or address. That allows the chief and everyone else in my agency to answer simple questions without having to bother me. But as an analyst, I need to answer far more complex questions--questions that involve both filtering and aggregation, and that involve linking multiple tables, sometimes across multiple databases.
   
For instance, an agency I recently worked with had a shoplifting problem involving teenagers. To help analyze it, I wanted to know what the teens were stealing. Specifically, I wanted a count of each type of item stolen in the last three years for all incidents in which the offense was coded as shoplifting, the person's involvement was coded as "suspect" or "arrestee," and the person's age was less than 19. 
    
Any SQL-based querying tool capable of handling relational data can answer questions across multiple tables, but few records management systems offer such capabilities in their "reporting" screens.
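
For the sake of illustration, here's roughly what that question looks like in SQL. The table and field names are hypothetical (every RMS structures these things differently), and the date function syntax varies by database.

    -- Count of each property type stolen in shoplifting incidents
    -- involving suspects or arrestees under 19 in the last three years.
    -- Table and field names are hypothetical; date syntax varies by database.
    SELECT pr.property_type,
           COUNT(*) AS items_stolen
    FROM incidents i
    JOIN persons p   ON p.case_number = i.case_number
    JOIN property pr ON pr.case_number = i.case_number
    WHERE i.offense = 'SHOPLIFTING'
      AND i.incident_date >= DATEADD(year, -3, GETDATE())
      AND p.involvement IN ('SUSPECT', 'ARRESTEE')
      AND p.age < 19
    GROUP BY pr.property_type
    ORDER BY items_stolen DESC;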
       
Such a query is a cinch using SQL or any number of applications that write SQL for you, but most records management systems simply would not support a query that crosses multiple tables this way, and even if they did, they would not be able to present the result in aggregated form.

Let me throw another wrench into it: Technically, the query I designed in the screenshot above is going to show a bit of misinformation. If more than one teenager was suspected or arrested in the same incident, the stolen property type will be duplicated for as many teens as were involved, regardless of how many items of property were actually stolen. To control for this, I need to first design a query that shows me which incidents of shoplifting involved teens, but (in SQL parlance) GROUP BY the case number so that regardless of how many teens were involved, each case number shows up only once. I then need to link this query into a second query asking what items of property were stolen in those incidents. Have you ever seen a records management system that allows you to ask one question, save the answer, and then use it as a filter in a second question? (Or, in database parlance, build a query on top of a query?) If so, please leave me a comment and let me know what system it is. It still won't obviate the need for the analyst to have direct access to the raw data, but I'll be impressed.
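
Here's a sketch of that two-step approach in SQL, again with hypothetical names. The inner query stands in for the first, saved question; the outer query asks the second.

    -- Step 1 (inner query): one row per shoplifting case involving a teen
    -- suspect or arrestee, no matter how many teens were involved.
    -- Step 2 (outer query): count the property types stolen in just those cases.
    -- Names are illustrative only.
    SELECT pr.property_type,
           COUNT(*) AS items_stolen
    FROM property pr
    JOIN (
        SELECT i.case_number
        FROM incidents i
        JOIN persons p ON p.case_number = i.case_number
        WHERE i.offense = 'SHOPLIFTING'
          AND p.involvement IN ('SUSPECT', 'ARRESTEE')
          AND p.age < 19
        GROUP BY i.case_number
    ) AS teen_cases ON teen_cases.case_number = pr.case_number
    GROUP BY pr.property_type
    ORDER BY items_stolen DESC;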
       
And even when RMSes allow for complex querying, they often do not offer flexible means of reporting the results, including aggregation.
       
Records systems vary significantly in their querying capabilities, but in my experience, few of them allow the user to:

  • Query multiple tables at once; for instance, I need a list of all white males currently in their 40s, over 6 feet tall, who we've ever suspected or arrested for residential burglary.
  • Select which fields appear in the results window; for instance, in the results screen for the search above, I want to see the burglary modus operandi factors that we track, such as the point and means of entry.
  • Aggregate or crosstabulate the results; for instance, I need a crosstabulation by sex and age of everyone we've ever arrested for drunk driving.
  • Create data ("expressions") in the process of querying (or at all); for instance, I need to create a field called "quarter" that extracts the quarter of the year from the date and then count the number of offenses by that field (see the sketch just below).
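
To illustrate that last bullet, here's a minimal sketch of an expression-plus-aggregation query. The table and field names are invented, and the date functions vary by database.

    -- Count offenses by quarter of the year using a calculated "quarter" field.
    -- DATEPART is SQL Server syntax; other databases use EXTRACT or similar.
    -- Field and table names are invented.
    SELECT DATEPART(quarter, incident_date) AS offense_quarter,
           COUNT(*) AS offense_count
    FROM incidents
    WHERE incident_date >= '2021-01-01'
    GROUP BY DATEPART(quarter, incident_date)
    ORDER BY offense_quarter;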
       
This is why the best vendors don't even try. They might offer a few basic internal searches, but for anything complex, they suggest the user use an external SQL-based reporting tool like Crystal Reports. That's fine with me, as long as you don't mind if I use the same connection to access the data with other technologies.

But if you still want to insist that your RMS allows searches of this complexity internally, that's still not enough because:

I need to be able to synthesize data.

Not all the questions I need to ask can be answered by the CAD and RMS data. Very often, I need to be able to ask questions that cross a variety of tables obtained from different places. Let's look at some examples:
    
  • I've obtained a list of all property pawned in my jurisdiction. I want to see if any items of stolen property in the RMS match the serial numbers of the pawned property (see the sketch after this list). I also want to know if any of the pawners are known burglary or theft offenders.
  • The state offers a website where I can download all the individuals with active warrants. To help prioritize the service of those warrants, I want to identify which people on that list are also on my list of "Top 50 Offenders," as determined by a separate query of the RMS data.
  • My local prosecutor's office sends me a monthly text file containing adjudications for the month. I want to link this data with the original crime data to run a variety of reports.
  • Here's one I had to do just recently: To help evaluate an agency's drunk driving prevention efforts, I needed to know the number of drunk driving crashes the agency had experienced each year for the past 10 years. Unfortunately, the "alcohol-involved" field in the crash database was unreliable. To get a rough estimate, I had to determine the number of incidents that started as CAD calls for traffic accidents and resulted in arrests or summonses for drunk driving. This required me to join tables from the CAD system to tables in the RMS system, something rarely possible even when the two systems are provided by the same vendor.
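
As a sketch of the first example above: once the pawn list is loaded into its own table, the serial number match is a single join. All names here are invented.

    -- Match serial numbers from an imported pawn list against stolen property
    -- recorded in the RMS. The pawned_property table is one I loaded myself;
    -- all names are illustrative.
    SELECT pw.pawn_date,
           pw.pawner_name,
           pw.serial_number,
           sp.case_number,
           sp.property_description
    FROM pawned_property pw
    JOIN stolen_property sp
      ON sp.serial_number = pw.serial_number
    WHERE pw.serial_number IS NOT NULL
      AND pw.serial_number <> '';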
       
Even more commonly, I need to be able to synthesize CAD and RMS data with data that I track myself. It's rare that the CAD and RMS provide all the data that an analyst needs, even if the analyst's question is limited to calls for service and crimes. Many analysts, for instance, code their own modus operandi variables in their own databases. To perform the type of "Top Offender" query described above, I need to be able to assign weights to various offense types. I may have lists of addresses for which I need to run regular queries for administrative reports (all crimes at hotels, all crimes at public housing). If your RMS doesn't automatically assign coordinates to addresses, I need to be able to join addresses to an external "coordinate library." Does your RMS let me create and manage additional tables that I can then join to the standard tables? If so, wow. But it still isn't enough.
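
To make the weighting idea concrete, here's a hedged sketch. The offense_weights table is something I would build and maintain myself; every other name is hypothetical.

    -- Rank offenders by a weighted score of their charges over the past two years.
    -- offense_weights is an analyst-maintained table (e.g., robbery = 10, theft = 2).
    -- All names are hypothetical; take the top 50 rows for a "Top 50 Offenders" list.
    SELECT p.person_id,
           p.last_name,
           p.first_name,
           SUM(w.weight) AS offender_score
    FROM persons p
    JOIN charges c         ON c.person_id = p.person_id
    JOIN offense_weights w ON w.offense_code = c.offense_code
    WHERE c.charge_date >= DATEADD(year, -2, GETDATE())
    GROUP BY p.person_id, p.last_name, p.first_name
    ORDER BY offender_score DESC;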
          
I need to be able to use data with a variety of technologies.
       
To answer the types of questions I've already offered, I want to be able to connect to the CAD and RMS data with some kind of SQL-based querying tool (I'm a big fan of Microsoft Access). But even if the RMS offered full SQL capabilities in its internal reporting tools, I'd still need to occasionally get the data out of the system. Every once in a while, someone comes along with a mapping, reporting, or statistical tool that does some unique and amazing stuff with data, and I need to be able to feed my data into it. Some examples:

  • CrimeStat, a free application that calculates a variety of spatial statistics, many of which I've never seen in any commercial application
  • Jerry Ratcliffe's Near-Repeat Calculator
  • Joel Caplan's Risk Terrain Modeling software
  • My own threshold database and top offender modeling database (links not offered because I'm in the process of updating them)
          
If you want to use this database to identify and manage intelligence on repeat offenders, you have to be able to get your data into it.
       
Of course, the most common example will be a simple GIS application for mapping. To identify hot spots, determine spatial correlations, and perform other GIS tasks, I have to be able to get the data out of the RMS and into the GIS. No, your internal crime mapping system isn't sufficient. 
     
Keep in mind that being able to extract the data for use in these technologies isn't as simple as an "export" button that sends some of the most common fields to a flat file. I need to use the external applications with all of the querying complexity described above. I may need to map all juvenile burglaries with televisions stolen. I may need to map all crimes committed by individuals on my "Top Offender" list. Simple exports don't do the job--only access to the full dataset in its original relational format does.

And speaking of the need to copy data out of the original database . . . 

I need to change data.
    
I hope this doesn't come as any surprise, but your data sucks (another topic to be explored in more detail another day). Addresses are wrong. Time fields aren't filled in. Crimes are mis-coded. The same people appear multiple times in the master name index. Out of laziness or impatience or insufficient choices in the libraries, officers have coded "Other" or "Miscellaneous" for substantial percentages of your location types and property types. I can fix a lot of these errors with update queries, but I suspect you're not keen on my running those queries on the production database. This is another reason that I need to be able to extract the data. All of it.
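
Here's the kind of cleanup I mean, as a minimal sketch run against my own copy of the data (never the production database). The field names and codes are invented.

    -- Recode a common data-entry problem in my working copy of the data:
    -- convenience stores logged as "Other" based on the premise name.
    -- Run against a replica or extract, never the live RMS; names are invented.
    UPDATE incidents_copy
    SET location_type = 'CONVENIENCE STORE'
    WHERE location_type IN ('OTHER', 'MISCELLANEOUS')
      AND premise_name LIKE '%7-ELEVEN%';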

I hope I've made my case, but let's deal with some . . . 

Common Objections

  • "Our data structures are proprietary." Oh, get over yourself. Tables, fields, and the relationships between them are boring. There are ways to make them more efficient, sure, but nothing that isn't taught in any Databases 101 course. If your RMS has any value, it's in the application that works with the data, not in the structure of the tables. I've inspected the structures of dozens of records management systems and have never said, "Ooh, they put an index on the 'DOB' field. I shall steal this trade secret and make millions."
  • "Analysts could change or delete records through the ODBC connection." Sure, if you set it up as a two-way connection. Don't do it that way. And if you're really concerned about it, set up a replication database and have the analysts connect to that.
  • "Analysts need to be restricted in what data they can access." There are some legitimate concerns here, such as if the RMS is regional and the participating agencies don't want analysts from other agencies to have open access to all their data. But this is easily overcome with user permissions or by setting up a replication database that only replicates what the analyst is allowed to access. 
  • "Open access puts a strain on the performance of the system." This flies in the face of the experience of many large agencies where analysts are running frequent queries with millions of records. But again, a replication database solves this problem.
  • "There are hundreds of tables in the database. It's too complex for the analysts to figure out." Give them a chance. They're not stupid. (If they are, fix your hiring practices.) Sure, it will take time, but it's time worth spending. Analysts have access to peer support and a variety of free technical support resources to help them.        
        
Let me offer a universal counter to all objections: This is done all the time. Yes, I have couched this issue (analysts not having access to their data) as a common, age-old problem. It probably affects at least 50% of full-time crime analysts. But the other 50% are working for agencies in which these objections have been overcome, or were never raised in the first place. There's nothing so unique about your agency or your RMS that you can't figure it out with a little effort.

If anyone has additional objections or new arguments for any of the above, please offer them in the comments, and I'll respond to them individually.
     
While I'm at it . . .
    
IT directors, it's your job to support the industry-standard software that analysts need to do their jobs, not dictate what software they're allowed to use. I'll offer a separate entry on why Microsoft Access is an ideal solution for a lot of crime analysis tasks, but for now it would just be nice if you'd stop deleting it from your analysts' computers. It's practically free, it has more capabilities than you think, analysts use it successfully even at very large agencies, and, no, Microsoft is not "moving away from it." Your prejudice against it is, and always has been, irrational.
       
In Summary:

  • Records management vendors: Have a solution ready for your clients. Don't make them beg and threaten. Public safety is at stake. Whatever solutions you think you offer short of direct access to the full dataset (or a replica thereof), I promise you, they're insufficient. If I didn't provide enough examples here, I can provide more. It's not your fault. No vendor could possibly anticipate all the ways an analyst might need to work with data.
  • IT staff: Try to remember that your role is to support your agency's needs, not dictate those needs. Stop blocking analysts from the material they need to do their jobs. Public safety is at stake.
  • Police executives: Support your analysts' efforts to get satisfaction from your RMS vendor and your IT staff. Demand that any prospective RMS vendor tell you how analysts will be able to access the data. Public safety is at stake.
  • Analysts: Never stop fighting. And make sure you're working on your data skills so that when you do have access to your full dataset, you actually know what to do with it. Public safety is at stake.

Feel free to leave comments with your experiences getting direct access to data from specific systems and vendors. 

Saturday, August 27, 2022

The More Things Change . . .

I couldn't find a picture from my first conference, but here I am at MACA in 1999. I'm sure I was talking about data quality and staffing issues.
       
This fall marks my 25th anniversary of attending International Association of Crime Analysts conferences. I attended my first in Milwaukee, Wisconsin in 1997 and haven't missed one since. Our field offers no better forum for training, networking, and professional development. 

I had no idea what to expect the first year I attended. What I found was a community of like-minded professionals who, although from different states and countries, faced many of the same issues that I was then facing at the Cambridge (MA) Police Department. It was such a relief to talk with them about common methods, obstacles, and frustrations. Twenty-five years later, these same discussions are taking place, which is good. What's less good is that those discussions still concern many of the same topics as they did in 1997. There's a certain inevitability to this; in the same way that ontogeny was once thought to recapitulate phylogeny, an analyst's individual struggles sometimes recapitulate the profession's collective struggles.

But it still feels that with the training, literature, and knowledge that the field has developed in the last three decades, we should have been able to eliminate, or at least ameliorate, some of the more common issues. Here, I present 10 common issues that I saw among the sessions and discussions at this year's IACA conference. Some of them were voiced by analysts themselves, but some of them are my own criticisms of the field. In each case, I have tried to suggest potential solutions. 

1. Difficulty accessing data. The persistence of this issue is particularly frustrating for me because most of my consulting over the past 15 years has tried to address it. This topic will be the subject of its own article on "Modern Crime Analysis" before long. It shouldn't be that hard: Analysts need raw data the way chefs need raw ingredients, the way painters need unadulterated pigments. Expecting them to work without it is like expecting great art from a paint-by-numbers kit or a great meal from a TV dinner. If you're not willing to give them what they need, you might as well not hire analysts in the first place. Yet everywhere we go, there is no end to the stories of IT departments refusing to allow ODBC access, RMS vendors refusing to supply data dictionaries, and cloud-based RMS solutions with no provision for extracting the data. And in all these places, condescending non-analysts try to tell analysts that they don't really need what they say they need. 

Solutions: The solution to this one probably lies in training and literature for executives, and perhaps for IT professionals, too, although I'm beginning to think they're a lost cause. Short of that, there are sound arguments for analysts to make, and I worry that some of them just aren't persistent enough. Again, more resources on this soon.
      
Answering complex questions about offenses, people, and property requires a) the ability to access data in its original raw, relational format, and b) an understanding of how relational database systems work.
        
2. A lack of skill and understanding when it comes to relational data. On the other hand, if we're going to demand access to raw data, we should know what to do with it. It baffles me how many analysts seem to think they can do everything they need with Microsoft Excel. Excel is a powerful tool, sure, but it handles data from only one table at a time. You can't possibly answer all the questions you need to answer with a flat-file system, and I suspect that many analysts simply aren't answering those more complex questions. I'm not here to force any specific technology on analysts. You can use Microsoft Access, Crystal Reports, Tableau, or directly query the data through some kind of SQL-based data studio. But if you're not using one of these technologies, you're not working with relational data at all, which is concerning.

Solutions: Better training and literature focused on this topic specifically; solving problem #1 so analysts actually have additional data to work with. Working with relational data probably should have greater prominence in the IACA's certification standards. Finally, more agencies should probably test for this skill during hiring, as I find that many people, even otherwise intelligent ones, have a tough time grasping relational data concepts.

3. Agencies not using what analysts produce. “I identified a pattern of robberies at Vietnamese restaurants after the fourth incident. I put out a bulletin about it with all the commonalities, but my department didn’t do anything with it. They didn’t even contact Vietnamese restaurants to warn them.” With a minor change in details, this is a near-direct quote from the Chicago conference. I seem to hear an identical story every year and, particularly from my early days, have plenty of them to tell. During the predictive policing era, my greatest concern was not that we wouldn’t be able to accurately predict crime but that the predictions wouldn’t matter because agencies wouldn’t implement operational practices that required them to do something about them. Think about your own agency. Even if you could predict with 100% certainty that the Vietnamese restaurant at the corner of Main and Elm would be robbed tomorrow night at exactly 21:30, is there an equal certainty that the crime would be prevented? How? Who would take responsibility for acting on that information? What would compel them to do so?
      
Solutions: This is perhaps one common complaint that has gotten better over the years, but it is still all too common. The solution likely lies in strategies like Rachel Santos’s “stratified model” and other policies that ensure actionable intelligence is acted upon. A good CompStat-style system would also ensure action. Government agencies like the Bureau of Justice Assistance have done a great job promoting training and technical assistance to implement such models, but of course that only helps the U.S., and even here only those agencies that pay attention.
     
4. Analysts only producing what they’re asked for. At least there’s a “success story” embedded in the example above: the analyst identified the pattern within four incidents and alerted the department. I’d rather have a situation in which the agency has the information it needs but chooses not to act than one in which it cannot act because it doesn’t have the necessary information. Unfortunately, my discussions suggest that too many of our colleagues produce only what their agencies request of them, which ensures that crime analysis in those agencies devolves into nothing but administrative analysis and investigative support. Crime analysis must be comprehensive and proactive. A complete portfolio of crime analysis services includes regular analyses of hot spots, emerging patterns, long-term problems, and repeat offenders and criminal organizations. (Of course, I mean the above for traditional police agencies; there are other criminal justice agencies that do not have a mandate for things like patrol and crime prevention, and analysts in those agencies have to focus their services on what their agencies can actually do.) I grant that there is some room for doing an “introductory” product in some of these areas if you know that your agency won’t use a more exhaustive one, but a lack of agency action in the past cannot justify withholding analytical services on an ongoing basis.
     
Solutions: Whether we use my “4P” model (people, places, patterns, and problems) or the four traditional types of crime analysis (tactical, strategic, intelligence, and administrative), our literature and training should perhaps focus on more holistic approaches to crime analysis. This is another area where more executive training would likely produce results, too. This problem goes away if executives are demanding a greater variety of analytical services.

I'll discuss my "4P" approach to analysis and crime reduction in a later entry, but it requires some focus on each of four areas.
          
5. Poor analytical staffing. Of course, enjoying the comprehensive suite of crime analysis products that I listed above requires that the agency staff itself with enough analysts to produce them. At the conference, I met analysts who were the only analysts for agencies with hundreds of police officers. It isn’t hard: one analyst per 1,500 UCR Part 1 crimes or per 2,800 NIBRS Group A crimes gives agencies what they need.
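
If you want to apply the formula to your own data, it's one query away. A sketch, with hypothetical table and field names, using last year's NIBRS Group A count:

    -- Estimate recommended analyst staffing from last year's NIBRS Group A count:
    -- 1 analyst per 2,800 Group A offenses. Table and field names are invented.
    SELECT COUNT(*) AS group_a_offenses,
           CEILING(COUNT(*) / 2800.0) AS recommended_analysts
    FROM offenses
    WHERE nibrs_group = 'A'
      AND offense_date >= '2021-01-01'
      AND offense_date <  '2022-01-01';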
          
Solutions: First, we need comparable formulas for countries other than the U.S. Second, since some agencies won't stop asking where those numbers “come from” and how they were “determined,” it would perhaps be nice if someone funded a research study that could validate them (or similar figures); I’d be glad to help. Third, I would like to see the government funding more regional crime analysis initiatives to bring the benefits of crime analysis to agencies that come nowhere near the 1,500 figure.
         
6. Limited focus on preventative solutions. I understand that there’s a certain amount of excitement that comes from “catching the bad guy,” and that being able to point to an offender jailed because of your work sometimes offers more visceral pleasure than a nebulous 15% drop in property crime. Still, I swiftly get depressed with too much talk about sweeps, busts, stings, raids, surveillance, social media monitoring, facial recognition, and other purely offender-focused solutions, particularly in an era that encourages us to be cautious with heavy-handed enforcement. Solutions like CPTED, hot spot policing, community policing, problem-oriented policing, and focused deterrence haven’t stopped working just because they’re old. They are the keys to long-term crime reduction, which not only protects victims but prevents offenders from becoming offenders in the first place. The opening session had some great material on Community Violence Intervention (CVI), and then I never heard anyone talk about it for the rest of the conference.
        
Solutions: Part of the problem here is that crime analysts have often been ignored by those writing the training and literature on the strategies mentioned above. Someone has to map the hot spots, analyze the problems, and list the top offenders. I realize that criminologists like to do those things for pilot projects, but those responsibilities must be transitioned to crime analysts if they’re going to become permanent parts of an agency’s strategic repertoire.
        
Did anyone hear "CPTED" even once this week?
        
7. Data quality. I think we can all agree on this one. It doesn’t seem to matter how much agencies are separated by time and space; they’re all dealing with invalid addresses, mis-coded offenses, blank incident dates and times, duplicate master names, stolen property entered in the narrative instead of the property module, and modus operandi factors ignored entirely.
          
Solutions: I’m still waiting for the RMS that’s as easy to use as TurboTax, walking the user through each important question one at a time, using skip logic to take the user to vital fields while ignoring others, and offering contextual help. I’m also waiting for a model agency that staffs a centralized records function with data quality responsibilities instead of leaving the issue to hundreds of individual officers and their beleaguered supervisors. I hoped that NIBRS standards would help, but they seem to just encourage officers to deliberately mis-code or under-code crimes to avoid the associated error messages.
          
A screenshot from the "master name" index of an agency's RMS. These are all the same person.
        
8. Limited analytical training. There are two issues here, the first exemplified by a colleague who has been in the field for 25 years but who attended the IACA conference for the first time only this year. There were many, many more analysts who got a lot out of the Chicago conference but who left simply assuming that they wouldn’t be back for several years, if at all. To suggest that analysts have small training budgets would erroneously imply that, in most agencies, they have any training budgets at all.
         
But this is also an issue for analysts. Conferences are fun and exciting, and they may boost morale, but we’ve long passed the point at which they’re absolutely necessary for training, networking, and professional development. There are more than enough websites, books, webinars, online courses, YouTube videos, toolkits, and Zoom meetings out there to form a robust professional development plan without having to fly to an exotic city every year. Yet while I meet many analysts who agree that they have skill and knowledge gaps to fill, I rarely meet any who have an organized way of tracking their professional development needs and seeking out the appropriate training sources.
          
Solutions: During the conference, there was some discussion about creating more training and literature specific to crime analysis supervisors and managers. I’m all in favor of this, and one standard that I’d like to see adopted by this effort is the idea of 10% professional development time. That is, analysts should engage in an average of 4 hours of professional development per week. If I were a supervisor, I would require each analyst in my employ to block out those 4 hours and keep an ongoing, prioritized list of the things they planned to do with that time. Such documentation could then fuel requests for more cost-intensive training, such as conferences. It’s much easier to get approved for a conference if you can say, “Look, learning how to automate certain data tasks is my top priority, and this conference offers six sessions that cover related topics” than if you just say, “I want to go to Chicago.”
         
The IACA has more than 100 archived webinars on its site, offering an amount of training equivalent to three or four conferences.
        
9. Products of limited actionability. As usual, the conference showcased a lot of pretty products—bulletins and reports for which the analyst had done a bang-up job in Microsoft Word or Publisher and had clearly spent a lot of time on headers and page borders and tables and colorful charts. I really enjoy seeing what’s happening with “dashboards” and their various dials and meters. But I’ll never forget the lesson that Tom Casady, then chief of the Lincoln (NE) Police Department, gave us in 2008 when he contrasted several award-winning bulletins with some information scrawled on the back of a coffee-stained envelope and explained that as a police executive, the envelope was the more useful of the two products.
     
Chief Casady's idea of an actionable product.
           
Comprehensive crime analysis products must answer the classic questions of who, what, when, where, and how, as well as two additional, vital questions: “so what?” and “now what?” If a police officer, detective, crime prevention officer, or commander can’t look at the product and make a targeted decision about how to act on it, it doesn’t matter how pretty it is. Dashboards are useful tools if they prompt deeper analysis, worthless otherwise.
         
Solutions: Having been an IACA bulletin contest judge for two or three years, I wouldn’t mind if the criteria were modified to focus a bit more on substance over style. (And I don’t mean to disparage any of the winners; there were plenty of examples of products that had both.) I’d like to see more case study presentations and webinars that draw a direct line from the analytical product to a chosen action, and perhaps more research into best practices in this area.
       
10. Poor organizational location of analysts. “My sergeant won’t let me attend [my local association’s meetings],” one analyst told me. “He says they take up too much time.” The very phrase made my blood boil, although in this case not because of the refused training. Rather, it was the first two words. There is no crime analyst on God’s green earth who deserves to be supervised by a sergeant. The top crime analyst of an agency (including the sole analyst in a one-analyst agency) ought to have a civilian rank equivalent to at least a second-level supervisor (a lieutenant in most U.S. departments).
         
I saved this one for last because the only evidence I can offer here is experiential and anecdotal. Nonetheless, from my near-30 years in the field and my 15 years of consulting with hundreds of agencies, it is my firm belief that:
        
  • The crime analysis function of a department should be organizationally located before the organizational chart branches off into its various major divisions (patrol, investigations, administrative or service), ideally reporting to a member of the executive staff or the chief executive himself or herself, thus ensuring that the analysts are there to serve everyone.
  • If the agency is large enough to require analysts in individual precincts or stations, they ought to respond daily to the needs of that station’s commander but report administratively to an internal crime analysis management structure.
        
Solutions: First, more research to help validate or refute the above. Second, I guess we're back to executive training again.

The organizational chart from my old agency shows the crime analysis function in the ideal location.
       
I was tempted to write about several other issues, including analysts never looking at CAD data, the physical placement of analysts, and the unfortunate intrusion of political ideologies into what ought to be a sober examination of facts, but each of these issues came up in only a single discussion, so I'm less convinced of their applicability to a large part of the field.

What other age-old problems in crime analysis should we have just solved by now, and what do you see as the solutions for them?