We often talk with our field and global partners about the importance of the analysis stage in a profiling exercise. Analysis, like profiling in general, is a process. It is a series of steps that require a mix of careful number crunching, intensive reading, focused writing, and lots and lots of discussion with partners to find consensus on what the data tells us. This helps you get from raw data, the bits and pieces you get in the field, to evidence that has meaning and that partners can then use for their decision-making.
Analysis is often seen as something scary and complex, best “left to the experts”. At the same time, there are different types of analysis that produce diverse outputs, and many partners struggle to understand which might best respond to their information needs in a specific displacement situation.
This year we are looking for ways to ‘demystify’ this process, both to improve our own practice and capacity to support our partners in the field, and to help them better understand each analysis approach. As a first step, we organised an internal workshop to review our current analysis techniques, frameworks and outputs, with a specific focus on “durable solutions analysis”, “urban analysis”, “joint inter-sectoral analysis” and “joint analysis”.
While we are still working on consolidating the insights from the workshop discussions, we would like to take this opportunity to look at some of the basics when it comes to the collaborative analysis of data: the building blocks of an analysis, key challenges you might come across in the field and how to overcome them.
At JIPS, we support our field partners in analysing displacement situations and informing joint responses through a common evidence-base. While the analysis approach may vary depending on context and the specific objectives of different profiling exercises, there are certain shared features: these are the “building blocks” of a collaborative analysis of displacement data.
These building blocks draw on our field experience (for example, analysing the situation of damaged and displacement-affected cities in Syria, informing planning towards sustainable solutions in Sudan, and measuring the integration of displaced populations in Thessaloniki, Greece), as well as on the valuable opportunities we have had to exchange with technical partners like ACAPS and OCHA’s Needs Assessment and Analysis Section.
ACAPS has prioritised building analysis capacity in the humanitarian sector through its Advanced Humanitarian Analysis Programme in 2018, to which we contributed substantially through specific modules on collaborative approaches as well as stakeholder engagement for joint analysis.
There is no single definition of analysis agreed upon by the wider community of humanitarian and development practitioners.
At JIPS, however, we understand analysis essentially as the process of taking something complex and breaking it down into simpler pieces. It’s about identifying patterns and trends to create meaning from data to answer specific questions for a specific purpose. We do not analyse for the sake of analysing, but to inform specific decisions for supporting displaced populations and joint responses to displacement.
What constitutes good analysis? At JIPS, we believe that the more systematic and collaborative our process is for identifying patterns and trends, and the better the data itself, the better we can answer the questions that we set out to resolve.
What does systematic analysis look like? This is where ACAPS’ analytical spectrum (see infographic below) comes into play: the first step in the process consists of describing the data, summarising and comparing it. In the next step, one would try to explain why a situation is the way it is, whether this makes sense and how it may compare to a pre-crisis situation. The third and fourth steps tend to be more challenging: interpreting the data and anticipating what might happen next.
While the descriptive and explanatory stages can be done by one or two individuals, the latter two stages – interpretation and anticipation – are much better done jointly by bringing together critical stakeholders and diverse expertise.
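To make the descriptive stage a little more concrete, here is a minimal sketch of what summarising and comparing data can look like in practice. It is not a JIPS tool or method: the dataset, column names and baseline figure are entirely hypothetical, and the sketch only illustrates the kind of number crunching that precedes the joint interpretation described above.

```python
# A hypothetical, illustrative example of the "describe" stage:
# summarise survey data by population group and compare it.
# All data, column names and figures are invented for illustration only.
import pandas as pd

# Hypothetical household survey records
survey = pd.DataFrame({
    "population_group": ["IDP", "IDP", "IDP", "non_displaced", "non_displaced"],
    "has_water_access": [0, 1, 0, 1, 1],
    "household_size":   [6, 5, 7, 4, 4],
})

# Describe: summarise the data per population group
summary = survey.groupby("population_group").agg(
    households=("has_water_access", "size"),
    water_access_rate=("has_water_access", "mean"),
    avg_household_size=("household_size", "mean"),
)
print(summary)

# Compare: set the summary against an assumed pre-crisis baseline
# (this comparison already edges towards the explanatory stage).
baseline_water_access = 0.90  # hypothetical pre-crisis rate
summary["gap_vs_pre_crisis"] = summary["water_access_rate"] - baseline_water_access
print(summary[["water_access_rate", "gap_vs_pre_crisis"]])
```

Even a simple table like this only becomes evidence once partners jointly interpret what the gaps mean and what might happen next, which is the point of the later stages of the spectrum.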
In JIPS’ field of displacement profiling, we aim to go through each of these steps in order to arrive at a rich, robust and agreed-upon analysis. For that purpose, we typically recommend that profiling partners hold joint workshops to discuss and agree on the results of each analysis stage. Nevertheless, the reality on the ground often makes this difficult. Challenges include:
Given these challenges, just reaching a shared understanding of a descriptive analysis of the situation can take a lot of time, discussion and compromise. But this may be enough to meet the objectives of the exercise, and reaching agreement on the analysis is key to the results being accepted by the partners.
We have developed various strategies over the years for overcoming these challenges. Emphasising the collaborative process and keeping stakeholders engaged throughout the analysis phase helps to ensure that partners remain on board when the time comes to review and question the data. This is important, because the data does not always conform to our assumptions.
Indeed, the point of collecting information is to challenge those assumptions and to get a more accurate picture of the situation. The more the partners can work together to understand and then interrogate the data, the stronger the analysis, and the stronger the consensus in the end. This is what makes profiling evidence not only robust but also useful.
Joint analysis is one of our key priorities in our 2018-2020 strategy, because we believe that through shared data and analysis, humanitarian and development actors can jointly make sense of data and plan their responses coherently.
Following our internal workshop, we are now consolidating the lessons learned from this year’s and past exercises, in order to refine the concepts we use and improve our support. We will share progress on this work in future articles, so stay tuned for more on analysis in profiling.
—
What has been your experience in pursuing effective analysis? We’d love to hear about your work and get your feedback on ours: get in touch at info@jips.org or find us on Facebook, Twitter, or LinkedIn.