Saturday, August 27, 2022

The More Things Change . . .

I couldn't find a picture from my first conference, but here I am at MACA in 1999. I'm sure I was talking about data quality and staffing issues.
       
This fall marks my 25th anniversary of attending International Association of Crime Analysts (IACA) conferences. I attended my first in Milwaukee, Wisconsin, in 1997 and haven't missed one since. Our field offers no better forum for training, networking, and professional development.

I had no idea what to expect the first year I attended. What I found was a community of like-minded professionals who, although from different states and countries, faced many of the same issues that I was then facing at the Cambridge (MA) Police Department. It was such a relief to talk with them about common methods, obstacles, and frustrations. Twenty-five years later, these same discussions are taking place, which is good. What's less good is that those discussions still concern many of the same topics as they did in 1997. There's a certain inevitability to this; in the same way that ontogeny was once thought to recapitulate phylogeny, an analyst's individual struggles sometimes recapitulate the profession's collective struggles.

But it still seems that, with the training, literature, and knowledge the field has developed over the last three decades, we should have been able to eliminate, or at least ameliorate, some of the more common issues. Here, I present 10 such issues that I saw among the sessions and discussions at this year's IACA conference. Some were voiced by analysts themselves; some are my own criticisms of the field. In each case, I have tried to suggest potential solutions.

1. Difficulty accessing data. The persistence of this issue is particularly frustrating for me because most of my consulting over the past 15 years has tried to address it. This topic will be the subject of its own article on "Modern Crime Analysis" before long. It shouldn't be that hard: analysts need raw data the way chefs need raw ingredients, the way painters need unadulterated pigments. Expecting them to work without those materials is like expecting great art from a paint-by-numbers kit or a great meal from a TV dinner. If you're not willing to give analysts what they need, you might as well not hire them in the first place. Yet everywhere we go, there is no end to the stories of IT departments refusing to allow ODBC access, RMS vendors refusing to supply data dictionaries, and cloud-based RMS solutions with no provision for extracting the data. And in all these places, condescending non-analysts try to tell analysts that they don't really need what they say they need.

Solutions: The solution to this one probably lies in training and literature for executives, and perhaps for IT professionals, too, although I'm beginning to think they're a lost cause. Short of that, there are sound arguments for analysts to make, and I worry that some of them just aren't persistent enough. Again, more resources on this soon.
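To make the request concrete, here is a minimal sketch of what read-only ODBC access enables, written in Python with the pyodbc library. Everything in it is a placeholder assumption on my part: the DSN, the credentials, and the table and column names will differ in every RMS (and the date functions here assume a SQL Server backend), which is exactly why the vendor's data dictionary matters.

```python
# A minimal, hypothetical sketch of read-only ODBC access to an RMS.
# The DSN, credentials, and table/column names are placeholders; the
# date functions assume a SQL Server backend.
import pyodbc

# A read-only account is usually all an analyst needs. That is a useful
# concession to offer a reluctant IT department.
conn = pyodbc.connect("DSN=rms_readonly;UID=analyst;PWD=********")
cursor = conn.cursor()

# Pull raw incident records from the last 30 days, untouched by any
# canned report or pre-aggregated export.
cursor.execute("""
    SELECT incident_number, report_date, offense_code, address
    FROM incidents
    WHERE report_date >= DATEADD(day, -30, GETDATE())
""")
for row in cursor.fetchall():
    print(row.incident_number, row.report_date, row.offense_code)
```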
      
Answering complex questions about offenses, people, and property requires a) the ability to access data in its original raw, relational format, and b) an understanding of how relational database systems work.
        
2. A lack of skill and understanding when it comes to relational data. On the other hand, if we're going to demand access to raw data, we should know what to do with it. It baffles me how many analysts seem to think they can do everything they need with Microsoft Excel. Excel is a powerful tool, sure, but it handles data from only one table at a time. You can't possibly answer all the questions you need to answer with a flat-file system, and I suspect that many analysts simply aren't answering those more complex questions. I'm not here to force any specific technology on analysts. You can use Microsoft Access, Crystal Reports, Tableau, or directly query the data through some kind of SQL-based data studio. But if you're not using one of these technologies, you're not working with relational data at all, which is concerning.
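As a hedged illustration of the kind of question that no single flat file can answer, here is a sketch using Python's built-in sqlite3 module against a hypothetical local extract. All table and column names (incidents, incident_persons, master_names) are invented for the example; substitute your own RMS schema. The question: which suspects named in this year's robberies had been named in earlier incidents?

```python
# A sketch of a relational question a single spreadsheet can't answer:
# which robbery suspects this year were named in earlier incidents?
# Table and column names are hypothetical; adapt to your own schema.
import sqlite3

conn = sqlite3.connect("rms_extract.db")  # a local extract, for illustration

query = """
SELECT p.name_id, p.last_name, p.first_name,
       COUNT(DISTINCT prior.incident_number) AS prior_incidents
FROM incidents AS i
JOIN incident_persons AS ip
  ON ip.incident_number = i.incident_number AND ip.role = 'SUSPECT'
JOIN master_names AS p
  ON p.name_id = ip.name_id
JOIN incident_persons AS prior_ip
  ON prior_ip.name_id = p.name_id
JOIN incidents AS prior
  ON prior.incident_number = prior_ip.incident_number
 AND prior.report_date < i.report_date
WHERE i.offense_code = 'ROBBERY'
  AND i.report_date >= '2022-01-01'
GROUP BY p.name_id, p.last_name, p.first_name
ORDER BY prior_incidents DESC;
"""
for row in conn.execute(query):
    print(row)
```

Four joins across three tables is routine work in SQL and essentially impossible in a single worksheet; that gap is the point.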

Solutions: Better training and literature focused on this topic specifically; solving problem #1 so analysts actually have additional data to work with. Working with relational data probably should have greater prominence in the IACA's certification standards. Finally, more agencies should probably test for this skill during hiring, as I find that many people, even otherwise intelligent ones, have a tough time grasping relational data concepts.

3. Agencies not using what analysts produce. “I identified a pattern of robberies at Vietnamese restaurants after the fourth incident. I put out a bulletin about it with all the commonalities, but my department didn’t do anything with it. They didn’t even contact Vietnamese restaurants to warn them.” With a minor change in details, this is a near-direct quote from the Chicago conference. I seem to hear an identical story every year and, particularly from my early days, have plenty of my own to tell. During the predictive policing era, my greatest concern was not that we wouldn’t be able to accurately predict crime but that the predictions wouldn’t matter because agencies wouldn’t implement the operational practices required to act on them. Think about your own agency. Even if you could predict with 100% certainty that the Vietnamese restaurant at the corner of Main and Elm would be robbed tomorrow night at exactly 21:30, is there an equal certainty that the crime would be prevented? How? Who would take responsibility for acting on that information? What would compel them to do so?
      
Solutions: This is perhaps the one common complaint that has gotten better over the years, but it is still all too common. The solution likely lies in strategies like Rachel Santos’s “stratified model” and other policies that ensure actionable intelligence is acted upon. A good CompStat-style system would also ensure action. Government agencies like the Bureau of Justice Assistance have done a great job promoting training and technical assistance to implement such models, but of course that only helps the U.S., and even here only those agencies that pay attention.
     
4. Analysts only producing what they’re asked for. At least there’s a “success story” embedded in the example above: the analyst identified the pattern within four incidents and alerted the department. I’d rather have a situation in which the agency has the information it needs but chooses not to act than one in which it cannot act because it doesn’t have the necessary information. Unfortunately, my discussions suggest that too many of our colleagues produce only what their agencies request of them, which ensures that crime analysis in those agencies devolves into nothing but administrative analysis and investigative support. Crime analysis must be comprehensive and proactive. A complete portfolio of crime analysis services includes regular analyses of hot spots, emerging patterns, long-term problems, and repeat offenders and criminal organizations. (Of course, I mean the above for traditional police agencies; there are other criminal justice agencies that do not have a mandate for things like patrol and crime prevention, and analysts in those agencies have to focus their services on what their agencies can actually do.) I grant that there is some room for doing an “introductory” product in some of these areas if you know that your agency won’t use a more exhaustive one, but a lack of agency action in the past cannot justify withholding analytical services on an ongoing basis.
     
Solutions: Whether we use my “4P” model (people, places, patterns, and problems) or the four traditional types of crime analysis (tactical, strategic, intelligence, and administrative), our literature and training should perhaps focus on more holistic approaches to crime analysis. This is another area where more executive training would likely produce results, too. This problem goes away if executives demand a greater variety of analytical services.

I'll discuss my "4P" approach to analysis and crime reduction in a later entry, but it requires some focus on each of four areas.
          
5. Poor analytical staffing. Of course, enjoying the comprehensive suite of crime analysis products that I listed above requires that the agency staff itself with enough analysts to produce them. At the conference, I met analysts who served as the sole analyst for agencies with hundreds of police officers. The formula isn’t hard: one analyst per 1,500 UCR Part 1 crimes, or per 2,800 NIBRS Group A crimes, gives agencies what they need.
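The arithmetic behind that rule of thumb, as a quick sketch. The 1,500 and 2,800 divisors come from the formula above; the crime counts in the examples are invented for illustration.

```python
# The staffing rule of thumb as arithmetic: one analyst per 1,500 UCR
# Part 1 crimes, or per 2,800 NIBRS Group A crimes. The example crime
# counts below are invented for illustration.
from math import ceil

def recommended_analysts(part1_crimes=None, group_a_crimes=None):
    if part1_crimes is not None:
        return ceil(part1_crimes / 1500)
    return ceil(group_a_crimes / 2800)

print(recommended_analysts(part1_crimes=6200))    # 5 analysts
print(recommended_analysts(group_a_crimes=2500))  # 1 analyst
```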
          
Solutions: First, we need comparable formulas for countries other than the U.S. Second, since some agencies won’t stop asking where those numbers “come from” and how they were “determined,” it would perhaps be nice if someone funded a research study that could validate them (or similar figures); I’d be glad to help. Third, I would like to see the government fund more regional crime analysis initiatives to bring the benefits of crime analysis to agencies that come nowhere near the 1,500 figure.
         
6. Limited focus on preventative solutions. I understand that there’s a certain amount of excitement that comes from “catching the bad guy,” and that being able to point to an offender jailed because of your work sometimes offers more visceral pleasure than a nebulous 15% drop in property crime. Still, I swiftly get depressed with too much talk about sweeps, busts, stings, raids, surveillance, social media monitoring, facial recognition, and other purely offender-focused solutions, particularly in an era that encourages us to be cautious with heavy-handed enforcement. Solutions like CPTED, hot spot policing, community policing, problem-oriented policing, and focused deterrence haven’t stopped working just because they’re old. They are the keys to long-term crime reduction, which not only protects victims but prevents offenders from becoming offenders in the first place. The opening session had some great material on Community Violence Intervention (CVI), and then I never heard anyone talk about it for the rest of the conference.
        
Solutions: Part of the problem here is that crime analysts have often been ignored by those writing the training and literature on the strategies mentioned above. Someone has to map the hot spots, analyze the problems, and list the top offenders. I realize that criminologists like to do those things for pilot projects, but those responsibilities must be transitioned to crime analysts if they’re going to become permanent parts of an agency’s strategic repertoire.
        
Did anyone hear "CPTED" even once this week?
        
7. Data quality. I think we can all agree on this one. It doesn’t seem to matter how much agencies are separated by time and space: they’re all dealing with invalid addresses, mis-coded offenses, blank incident dates and times, duplicate master names, stolen property entered in the narrative instead of the property module, and modus operandi factors ignored entirely.
          
Solutions: I’m still waiting for the RMS that’s as easy to use as TurboTax, walking the user through each important question one at a time, using skip logic to take the user to vital fields while ignoring others, and offering contextual help. I’m also waiting for a model agency that staffs a centralized records function with data quality responsibilities instead of leaving the issue to hundreds of individual officers and their beleaguered supervisors. I hoped that NIBRS standards would help, but they seem to just encourage officers to deliberately mis-code or under-code crimes to avoid the associated error messages.
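Short of that model RMS, a centralized records unit can at least automate the audit. Here is a minimal sketch of weekly data-quality checks covering the error types listed above, again in Python with sqlite3 against a hypothetical extract; every table and field name is an assumption to be adapted to your own system.

```python
# A minimal sketch of an automated weekly data-quality audit. All table
# and field names are hypothetical; adapt them to your own RMS extract.
import sqlite3

conn = sqlite3.connect("rms_extract.db")

checks = {
    "blank incident dates":
        "SELECT COUNT(*) FROM incidents "
        "WHERE report_date IS NULL OR report_date = ''",
    "blank incident times":
        "SELECT COUNT(*) FROM incidents "
        "WHERE occurred_time IS NULL OR occurred_time = ''",
    "invalid offense codes":
        "SELECT COUNT(*) FROM incidents WHERE offense_code NOT IN "
        "(SELECT code FROM offense_codes)",
    "possible duplicate master names":
        "SELECT COUNT(*) FROM (SELECT last_name, first_name, dob "
        "FROM master_names GROUP BY last_name, first_name, dob "
        "HAVING COUNT(*) > 1)",
}

# Route the resulting counts to a records unit with explicit data-quality
# responsibilities rather than to hundreds of individual officers.
for label, sql in checks.items():
    (count,) = conn.execute(sql).fetchone()
    print(f"{label}: {count}")
```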
          
A screen shot from the "master name" index of an agency's RMS. These are all the same person.
        
8. Limited analytical training. There are two issues here, the first exemplified by a colleague who has been in the field for 25 years but who attended the IACA conference for the first time only this year. There were many, many more analysts who got a lot out of the Chicago conference but who left simply assuming that they wouldn’t be back for several years, if at all. To say that analysts have small training budgets would wrongly imply that, in most agencies, they have any training budgets at all.
         
But this is also an issue for analysts. Conferences are fun and exciting, and they may boost morale, but we’ve long passed the point at which they’re absolutely necessary for training, networking, and professional development. There are more than enough websites, books, webinars, online courses, YouTube videos, toolkits, and Zoom meetings out there to form a robust professional development plan without having to fly to an exotic city every year. Yet while I meet many analysts who agree that they have skill and knowledge gaps to fill, I rarely meet any who have an organized way of tracking their professional development needs and seeking out the appropriate training sources.
          
Solutions: During the conference, there was some discussion about creating more training and literature specific to crime analysis supervisors and managers. I’m all in favor of this, and one standard that I’d like to see adopted by this effort is the idea of 10% professional development time. That is, analysts should engage in an average of 4 hours of professional development per week. If I were a supervisor, I would require each analyst in my employ to block out those 4 hours and keep an ongoing, prioritized list of the things they planned to do with that time. Such documentation could then fuel requests for more cost-intensive training, such as conferences. It’s much easier to get approved for a conference if you can say, “Look, learning how to automate certain data tasks is my top priority, and this conference offers six sessions that cover related topics” than if you just say, “I want to go to Chicago.”
         
The IACA has more than 100 archived webinars on its site, offering an amount of training equivalent to three or four conferences.
        
9. Products of limited actionability. As usual, the conference showcased a lot of pretty products—bulletins and reports for which the analyst had done a bang-up job in Microsoft Word or Publisher and had clearly spent a lot of time on headers and page borders and tables and colorful charts. I really enjoy seeing what’s happening with “dashboards” and their various dials and meters. But I’ll never forget the lesson that Tom Casady, then chief of the Lincoln (NE) Police Department, gave us in 2008 when he contrasted several award-winning bulletins with some information scrawled on the back of a coffee-stained envelope and explained that as a police executive, the envelope was the more useful of the two products.
     
Chief Casady's idea of an actionable product.
           
Comprehensive crime analysis products must answer the classic questions of who, what, when, where, and how, as well as two additional, vital ones: “so what?” and “now what?” If a police officer, detective, crime prevention officer, or commander can’t look at the product and make a targeted decision about how to act on it, it doesn’t matter how pretty it is. Dashboards are useful tools if they prompt deeper analysis, and worthless otherwise.
         
Solutions: Having been an IACA bulletin contest judge for two or three years, I wouldn’t mind if the criteria were modified to weight substance a bit more heavily than style. (And I don’t mean to disparage any of the winners; there were plenty of examples of products that had both.) I’d like to see more case study presentations and webinars that draw a direct line from the analytical product to a chosen action, and perhaps more research into best practices in this area.
       
10. Poor organizational location of analysts. “My sergeant won’t let me attend [my local association’s meetings],” one analyst told me. “He says they take up too much time.” The very phrase boiled my blood, although in this case not because of the refusal of training. Rather, it was the first two words. There is no crime analyst on God’s green earth who deserves to be supervised by a sergeant. The top crime analyst of an agency (including the sole analyst in a one-analyst agency) ought to have a civilian rank equivalent to at least a second-level supervisor (a lieutenant in most U.S. departments).
         
I saved this one for last because the only evidence I can offer here is experiential and anecdotal. Nonetheless, from my near-30 years in the field and my 15 years of consulting with hundreds of agencies, it is my firm belief that:
        
  • The crime analysis function of a department should be organizationally located before the organizational chart branches off into its various major divisions (patrol, investigations, administrative or service), ideally reporting to a member of the executive staff or the chief executive himself or herself, thus ensuring that the analysts are there to serve everyone.
  • If the agency is large enough to require analysts in individual precincts or stations, those analysts ought to respond to the daily needs of that station’s commander but report administratively to an internal crime analysis management structure.
        
Solutions: First, more research to help validate or refute the above. Second, I guess we're back to executive training again.

The organizational chart from my old agency shows the crime analysis function in the ideal location.
       
I was tempted to write about several other issues, including analysts never looking at CAD data, the physical placement of analysts, and the unfortunate intrusion of political ideologies into what ought to be a sober examination of facts, but these issues came up in only single discussions, so I'm less convinced of their applicability to a large part of the field.

What other age-old problems in crime analysis should we have just solved by now, and what do you see as the solutions for them?