Service Level Management

An important part of defining ITIL Services is understanding and agreeing the correct levels of service with the customers of each service. Using a Service Lifecycle approach with Ivanti Service Desk enables you to guide the Service Level Manager through the correct steps as part of defining and managing each individual Service.

To enable the requirements gathering, agreement creation, and target setting required for mature Service-Centric service level management, your Portfolio process contains actions to:

  • Review and agree Service Level Requirements (SLRs) and Service Level Targets (SLTs) with customers. Collections built during the Service Lifecycle are used to capture multiple SLRs and SLTs; clicking the Add Service Level Requirements action enables you to capture this information (see the conceptual sketch after this list).
  • Create Service Level Agreements – with agreement scope, creation date, contacts and targets created as collections on the Service window. For each agreement, select the Customer, User or Third Party from the customer/user list. Once selected, click the hyperlink on the list to access the full Customer details. Users with the appropriate privileges can view and modify all of the Customer groups, the End-users, the Third Party groups and the Third Party Contacts using the Administration component.
  • Produce an SLA document. Typically this is manually produced and attached to the Service CI.
  • Review agreement content with the Service Level Manager to confirm all agreements are captured and stored, and updates have been applied in Service Desk – for example Response Levels and Escalations – to support the agreements. The Service CI window displays fields that describe the typical broad levels of service provided, including the Service Hours.
  • Build Service Quality Plans (SQPs) (detailing all measurement methods and inputs into monitoring Service performance) – usually attached as documents to the Service Portfolio process.
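
To make the relationships between these collections easier to picture, the following is a minimal conceptual sketch in Python. The class and field names are illustrative assumptions only – they do not represent the Ivanti Service Desk schema – but they show how a Service CI can carry SLRs, SLTs, and SLAs as collections.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

# Illustrative classes only: these names are assumptions, not the Service Desk schema.

@dataclass
class ServiceLevelTarget:
    metric: str    # e.g. "Response time" or "Availability"
    target: str    # e.g. "4 business hours" or "99.5%"

@dataclass
class ServiceLevelRequirement:
    description: str
    agreed_with: str                       # customer the requirement was agreed with
    targets: List[ServiceLevelTarget] = field(default_factory=list)

@dataclass
class ServiceLevelAgreement:
    scope: str
    creation_date: date
    customer: str                          # Customer, User or Third Party
    contacts: List[str] = field(default_factory=list)
    targets: List[ServiceLevelTarget] = field(default_factory=list)

@dataclass
class ServiceCI:
    name: str
    service_hours: str
    slrs: List[ServiceLevelRequirement] = field(default_factory=list)
    slas: List[ServiceLevelAgreement] = field(default_factory=list)

# A Service CI carrying one requirement and one agreement as collections.
email = ServiceCI(
    name="Email Service",
    service_hours="07:00-19:00 Mon-Fri",
    slrs=[ServiceLevelRequirement(
        description="Restore mailbox access promptly",
        agreed_with="Finance Dept",
        targets=[ServiceLevelTarget("Response time", "4 business hours")])],
    slas=[ServiceLevelAgreement(
        scope="All Finance end users",
        creation_date=date(2024, 1, 15),
        customer="Finance Dept",
        contacts=["finance.lead@example.com"],
        targets=[ServiceLevelTarget("Availability", "99.5%")])],
)
print(email.name, "- SLRs:", len(email.slrs), "SLAs:", len(email.slas))
```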

You can access the complete Service Level information from the Service CI, and also from the Service Level Management component in Administration. In either case, if you have the appropriate privileges, you can modify all of these values and records – including SLA, OLA, UC and the SLTs described above. However, we recommend that you set a Service Review date to identify when you need to review the agreements, and then modify your SLA query to group by Date to show a schedule of future service agreement reviews.
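
As a minimal sketch of the recommended review schedule, the following groups a small set of agreement records by their Service Review date. The records and field names are assumptions for illustration, not the Service Desk SLA table or query definition.

```python
from collections import defaultdict
from datetime import date

# Sample agreement records; the field names are assumptions for illustration only.
agreements = [
    {"service": "Email Service",   "customer": "Finance",   "review_date": date(2025, 9, 1)},
    {"service": "Payroll Service", "customer": "HR",        "review_date": date(2025, 9, 1)},
    {"service": "VPN Service",     "customer": "All staff", "review_date": date(2025, 11, 15)},
]

# Group by Service Review date to produce a schedule of future agreement reviews.
schedule = defaultdict(list)
for agreement in agreements:
    schedule[agreement["review_date"]].append(agreement["service"])

for review_date in sorted(schedule):
    print(f"{review_date}: {', '.join(schedule[review_date])}")
```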

To see more detail on the internal and external service providers, expand the User Management tree in the Administration component. To see a list of external providers, expand the Suppliers folder. To see internal providers, expand the Support Group folder.

The selected Service Provider is visible on the Service Window. To see a list of the providers of Services, add the Service Provider field to the Service Portfolio query.

Design Idea: Create a separate Change process linked to your Service CI to provide a full and complex service review activity.

Design Idea: A query of Service CIs showing the collections of agreements on each Service enables you to quickly click through and view each service and its associated SLAs, OLAs, and UCs without needing to drill down into each one.

If you are building your Service Agreements in this way, you could also import on a scheduled basis from the Service Agreement tables into the Response Level Matrix to keep your automatic setting of Response Levels on processes aligned with your Service Lifecycle changes.
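
The sketch below illustrates the idea of such a scheduled import, assuming simple in-memory records in place of the Service Agreement tables and the Response Level Matrix; the field names and the sync logic are illustrative assumptions, not the product's import mechanism.

```python
# Hedged sketch of a scheduled sync step: copy response targets held against each
# Service Agreement into the rows used for automatic Response Level selection.
# The dictionaries stand in for the Service Agreement and Response Level Matrix
# records; the field names are assumptions for illustration, not the product schema.

def build_matrix_rows(agreements):
    """Derive one matrix row per (service, customer) pair from agreement targets."""
    return [{"service": a["service"],
             "customer": a["customer"],
             "response_level": a["response_level"]} for a in agreements]

def sync_matrix(existing_rows, new_rows):
    """Replace rows whose response level changed; report which keys were updated."""
    index = {(r["service"], r["customer"]): r for r in existing_rows}
    updated = []
    for row in new_rows:
        key = (row["service"], row["customer"])
        if index.get(key) != row:
            index[key] = row
            updated.append(key)
    return list(index.values()), updated

agreements = [
    {"service": "Email Service", "customer": "Finance", "response_level": "Gold"},
    {"service": "VPN Service", "customer": "All staff", "response_level": "Silver"},
]
matrix = [{"service": "Email Service", "customer": "Finance", "response_level": "Silver"}]

matrix, changed = sync_matrix(matrix, build_matrix_rows(agreements))
print("Updated:", changed)   # Email Service/Finance raised to Gold; VPN Service added
```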

You may not be able to measure all of your SLM requirements through Response Levels. Typical SLA documents confirm a number of dimensions, including Service Uptime and Availability. You may need input from Availability Management and Event Management to track these in Service Desk. Reports that bring together multiple inputs from these other activities provide valuable information when you are tracking performance against complex Service Level Agreements.

For example, the standard Crystal Reports include one that provides availability measures and another that provides Response Level performance. Run these separately, or combine them as sub-reports in Crystal Reports to see a single view of both availability and responsiveness, showing overall performance against Service Level targets. An example of this is the KPI Summary report (available from the Ivanti Community), where MTTR and MTBF metrics are combined with response level information.
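
To show the kind of calculation such a combined report performs, the following sketch derives MTTR, MTBF, and availability from a set of made-up outage records and places them alongside a sample response-level figure. The data and figures are illustrative only.

```python
# Illustrative calculation of the availability-side metrics (MTTR, MTBF, availability %)
# that a report such as the KPI Summary combines with Response Level performance.
# The outage records and response figures below are made-up sample data.

outages_hours = [2.0, 0.5, 1.5]          # duration of each outage in the period
period_hours = 30 * 24                   # reporting period: 30 days

downtime = sum(outages_hours)
uptime = period_hours - downtime
failures = len(outages_hours)

mttr = downtime / failures               # mean time to restore service
mtbf = uptime / failures                 # mean time between failures
availability = uptime / period_hours * 100

# Response-side metric from the same period (sample figures):
resolved_within_target, total_resolved = 47, 50
response_performance = resolved_within_target / total_resolved * 100

print(f"MTTR: {mttr:.2f} h, MTBF: {mtbf:.1f} h")
print(f"Availability: {availability:.2f}% | Response within target: {response_performance:.0f}%")
```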

Design Idea: You can monitor Service Level Agreements visually or in a report in a number of ways, in varying complexity, based on your implementation of Service Levels.

  1. Monitoring SLA response performance against agreements by both Service and Customer is most easily delivered through the standard Service Level Agreement Monitoring (SLAM) report – SLAM Chart.
  2. Combining multiple inputs – for example, where availability needs to be considered alongside response times and other external factors – may require an initial Service Level performance review in Service Desk, where an Ahead, On Track, or Behind value is set on each current agreement (see the sketch after this list).
  3. Remember that Ivanti MI can act as an aggregator of multiple feeds over time: it can take input from Service Desk performance data and external sources, and present daily metrics in graphical form.
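
For option 2 above, the following is a minimal sketch of setting an Ahead, On Track, or Behind value on each agreement from combined availability and response inputs. The records, target values, and the two-point margin are arbitrary assumptions chosen for illustration.

```python
# Classify each agreement by comparing both measures against their targets.
# Thresholds and sample records are illustrative assumptions only.

def review_status(availability_pct, availability_target,
                  response_pct, response_target, margin=2.0):
    """Return Ahead / On Track / Behind based on the worst gap to target."""
    worst_gap = min(availability_pct - availability_target,
                    response_pct - response_target)
    if worst_gap >= margin:
        return "Ahead"
    if worst_gap >= 0:
        return "On Track"
    return "Behind"

agreements = [
    {"service": "Email Service", "availability": 99.7, "availability_target": 99.5,
     "response": 96.0, "response_target": 90.0},
    {"service": "VPN Service", "availability": 98.9, "availability_target": 99.5,
     "response": 93.0, "response_target": 90.0},
]
for a in agreements:
    status = review_status(a["availability"], a["availability_target"],
                           a["response"], a["response_target"])
    print(a["service"], "->", status)   # Email: On Track, VPN: Behind
```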

When producing Service performance reports, as well as reporting by Service and by Customer, the same information can be invaluable for assessing the performance of your third-party suppliers, or of services that are supplied entirely externally, such as some cloud services. The same measurements of availability and events can track how reliable and performant these external services are.

Best Practice: It is easy to become focused on the internal IT activities needed to achieve agreed SLA performance, and to assume that if these agreements are being met then IT is always doing a 'good job'. Sometimes the level of performance and IT service may match the agreement, yet still fail to satisfy the customers and end users. This can happen over time as business expectations and needs change faster than the agreements in place with IT. Use the Survey capability of Service Desk processes to monitor ongoing Customer Satisfaction. The default Incident process includes Survey actions and objects to enable each end user to record their current satisfaction with IT. Also consider using the standard Complaints process to enable your customers to record a Complaint – or a compliment – through Self Service.

In support of this activity it is also important to ensure that you regularly perform Service Reviews. This is implemented by way of a Task process on the Service Portfolio process, enabling the management of Service Reviews throughout the lifecycle of the Service. Proposed review dates are specified on the Chartered collection of the process and can be used to trigger the creation of the Task record.
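
A hedged sketch of that trigger, assuming simple in-memory records in place of the Chartered collection and a stand-in function for raising the Task record, might look like this:

```python
from datetime import date, timedelta

# Sketch of the review-date trigger: a scheduled check that raises a Service Review
# task when a proposed review date falls within the look-ahead window. Record shapes
# and the create_task stand-in are illustrative assumptions, not the Service Desk
# process engine.

def create_task(service, review_date):
    """Stand-in for raising a Task record against the Service Portfolio process."""
    print(f"Task created: review '{service}' on {review_date}")

def trigger_due_reviews(services, today, look_ahead_days=14):
    window_end = today + timedelta(days=look_ahead_days)
    for svc in services:
        for review_date in svc["proposed_reviews"]:
            if today <= review_date <= window_end and review_date not in svc["tasks_raised"]:
                create_task(svc["name"], review_date)
                svc["tasks_raised"].add(review_date)   # avoid raising duplicates

services = [
    {"name": "Email Service",
     "proposed_reviews": [date(2024, 6, 10), date(2024, 12, 10)],
     "tasks_raised": set()},
]
trigger_due_reviews(services, today=date(2024, 6, 1))   # raises the June review only
```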