The Evidence Portal is a publicly available website containing high-quality research evidence.
It was established so that organisations delivering human services in NSW can access and apply evidence in the design and delivery of their programs. For further information see About the Portal.
The Portal currently contains several types of evidence.
In the future the Portal will also contain emerging research evidence.
The research evidence available on the Portal can be used in a number of ways.
Research evidence is only one form of evidence.
Experienced practitioners have vital knowledge about the families, communities and service systems within which they work. Effective services incorporate practitioner expertise in both design and, particularly, implementation.
Likewise, effective service design and implementation will reflect the lived experiences of clients, their values and preferences. Incorporating client voice helps to prevent avoidable harm and results in better client outcomes. Creating a space where clients' voices are heard, and can directly influence both service design and the services they receive, ensures services are tailored and more likely to be accessed and effective.
Evidence-informed programs are identified through high-quality evidence reviews conducted on specific topics. Each evidence review follows a strict process to search for, screen and assess research and evaluations to identify high-quality, evidence-informed programs.
See the Technical Specifications for more information about the process followed.
These specifications provide detailed guidance, explanations and examples to ensure our evidence reviews are systematic, rigorous and transparent.
The rating a program receives depends on the quality and volume of the evidence behind it.
The approach for rating evidence was informed by, and adapted from, other publicly available evidence rating scales, including the Early Intervention Foundation Evidence Standards and the What Works Clearinghouse Procedures and Standards Handbook (Version 4.0) (United States Department of Education, 2017).
See the Technical Specifications for more information about the Evidence Rating Scale.
Core components are extracted from evidence-informed programs.
After an evidence review has identified and rated evidence-informed programs, core components can be extracted from them. This involves closely examining and grouping the types of activities (core components) that are undertaken as part of each program. The way these activities are implemented is also captured. These are the flexible activities within each core component.
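The hierarchy described above, in which a program is broken into core components, each with its own flexible activities, can be pictured as a simple data structure. This is an illustrative sketch only; the class names and the example program and component names are hypothetical, not drawn from the Portal:

```python
from dataclasses import dataclass, field

@dataclass
class CoreComponent:
    """A type of activity undertaken as part of a program."""
    name: str
    # How the activity is implemented in practice: the flexible
    # activities within this core component.
    flexible_activities: list[str] = field(default_factory=list)

@dataclass
class Program:
    """An evidence-informed program identified by an evidence review."""
    name: str
    core_components: list[CoreComponent] = field(default_factory=list)

# Hypothetical example: the same core component could appear in several
# programs while being implemented through different flexible activities.
program = Program(
    name="Example program",
    core_components=[
        CoreComponent(
            name="parent skills training",
            flexible_activities=["group sessions", "individual coaching"],
        ),
    ],
)
```

Organising programs this way is what lets a core components approach compare activities across otherwise different programs.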
See Section 2.7 of the Technical Specifications for further detail and examples of identifying core components and flexible activities.
See Using a core components approach for more information.
Detailed information about each evidence review we have conducted is on our evidence reviews page.
Alternatively, you can email us: EvidencePortal@dcj.nsw.gov.au
We want to make sure the Evidence Portal includes the most up-to-date research. We plan to update the information about evidence-informed programs and core components periodically to incorporate new research and evaluations as they are conducted and published.
When working in diverse communities with complex circumstances and changing needs, it’s important we implement services that are flexible and tailored to local needs. An evidence base of purely manualised ‘off the shelf’ programs may inhibit such flexibility. In addition, the cost of manualised programs is often prohibitive.
As such, a core components approach has been taken to organise the evidence in a way that is meaningful and easily applicable to existing programs and services. For further information see Using a core components approach.
Each program that meets the criteria for inclusion is given a ‘strength of evidence’ rating. This helps us understand the quality and volume of evidence that sits behind each program.
There are seven different ratings a program can receive.
For more information see: Understanding the Evidence Rating Scale.
Programs that are determined to have a positive effect on at least one client outcome are included on the Evidence Portal as evidence-informed programs.
Effectiveness refers to the ability of a program to achieve positive client outcomes. Each program in the Evidence Portal is identified as having a positive, negative, neutral or mixed effect on client outcomes.
A positive effect means the program was able to improve client outcomes – positive changes occurred.
A neutral effect means the program did not impact client outcomes – they stayed the same.
A negative effect means the program had an adverse effect on client outcomes – they got worse.
A mixed effect means the client outcomes were a combination of positive, negative and/or neutral.
The Evidence Portal only includes evidence-informed programs – that is, programs identified in the review that were found to have a positive effect on at least one client outcome.
Programs for which the evidence fails to demonstrate an effect, or demonstrates adverse effects, are not included in the Evidence Portal.
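The classification and inclusion rules described above can be sketched as a small decision rule. This is an illustrative sketch only; `classify_effect` and `eligible_for_portal` are hypothetical names, not part of the Portal's actual process:

```python
def classify_effect(outcomes):
    """Classify a program's overall effect from its per-outcome results.

    `outcomes` is a list of strings, one per evaluated client outcome,
    each "positive", "neutral" or "negative".
    """
    kinds = set(outcomes)
    if len(kinds) == 1:
        # Every evaluated outcome agrees, so the program's overall
        # effect is simply that result.
        return next(iter(kinds))
    # Any combination of positive, neutral and/or negative results
    # gives a mixed effect.
    return "mixed"

def eligible_for_portal(outcomes):
    """A program is listed only if at least one client outcome improved."""
    return "positive" in outcomes
```

For example, a program whose evaluations found one positive and one neutral outcome would be classified as mixed, yet still be eligible for inclusion because at least one outcome improved.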
If the evidence rating for a program is ‘supported research evidence’ or ‘promising research evidence’, the evidence shows the program can have a positive impact on client outcomes. We can be reasonably confident in programs rated ‘supported research evidence’ because they have had multiple evaluations that all show a positive impact.
However, it is important to remember that ‘one size does not fit all’. Look at the target group for each program and see if it is similar to the clients you work with. For example, do not assume that a program that has been evaluated with Anglo-Celtic families will work for Aboriginal families.
Before you implement a program you should review the assessed needs of your clients, their goals, and the resources you have available.
Programs can have mixed evidence if at least one client outcome was positive, another was neutral and/or another was negative.
Programs with a combination of positive and neutral outcomes are rated as ‘mixed research evidence (with no adverse effects)’. Caution should also be used when implementing these programs. Carefully review the client outcomes the program can positively impact and the outcomes it is unlikely to achieve.
A program is given the rating ‘evidence fails to demonstrate effect’ if an evaluation shows that the program did not have a positive or negative effect on client outcomes. These programs are not included on the Evidence Portal.
While the program may not be effective in the specific context in which it was evaluated, information about that program could still be useful to help us understand what does and doesn’t work for our clients. If the evidence shows that a program has no benefit, then it is recommended to consider alternative programs or activities.
A program is given the rating ‘evidence demonstrates adverse effects’ if an evaluation shows that the program only had a negative impact on client outcomes.
It is not recommended that these programs are implemented. They have not been included on the Evidence Portal.
It is important that the information on the Portal is not static and is updated to reflect changes to the evidence. As such, reasonable efforts will be made to ensure the program summaries and core components are reviewed regularly and updated as more evidence becomes available.
We will re-run searches on particular topics to identify newly published research and will incorporate this into the Evidence Portal over time.
No. We know that there are gaps in human services evidence.
Not all programs and activities have had the benefit of an evaluation or have been included in a research study.
It is hoped that the Evidence Portal can help to fill this gap, and build the understanding of what works for families and communities in NSW.
The evidence reviews we conduct to populate the Evidence Portal identify programs and activities from all over the world.
There are, however, few high-quality evaluations of relevant programs and activities that have been conducted in Australia.
The Evidence Portal aims to make research evidence readily available to busy practitioners in an easy-to-understand format.
In the near future, there will be resources to support organisations to design or adopt and implement new services and activities.
For help applying this evidence check out our Using evidence page or email EvidencePortal@dcj.nsw.gov.au
We want to keep the information on the Evidence Portal relevant and useful.
If there are any specific research topics you would like to see included on the Portal, please email: EvidencePortal@dcj.nsw.gov.au
02 Aug 2024
We acknowledge Aboriginal people as the First Nations Peoples of NSW and pay our respects to Elders past, present, and future.
Informed by lessons of the past, the Department of Communities and Justice is improving how we work with Aboriginal people and communities. We listen and learn from the knowledge, strength and resilience of Stolen Generations Survivors, Aboriginal Elders and Aboriginal communities.
You can access our apology to the Stolen Generations.