Reports

ServiceDesk produces a plethora of reports.  In early 2010 we realized there was a need to comprehensively catalog and describe each of these in a single document.  This document is the current state of that work.  Although the vision is for it to become comprehensive, it is not yet so.  In other words, it’s a work-in-progress, and significant further work is needed before it can claim to be a one-place catalog and description of all ServiceDesk reports.

In the meantime, we can at minimum claim this document describes several of the major reports.  In particular, it describes (likely) most of the reports that involve analytics.

For context, many of ServiceDesk’s reports are scattered among various contextually relevant operational venues.  None of those scattered reports have yet had descriptions added to this document (that remains a future project).

By contrast, there is a significant collection of reports accessed via a particular form whose sole purpose is to be the locale from which that collection is accessed.  It’s called (without any intended irony) the Reports form (accessed via shortcut F11).

A general note about reports in the Reports form: for virtually every one, there is an opportunity to export the raw, record-by-record data on which the analytical summary figures are based.  These exports can be helpful if you wish to perform your own extended analysis or reporting, or simply wish to check the integrity of the analytical results as otherwise presented to you.  To produce such an export, after a report displays, look for an Export button in the Reports form’s bottom-right corner.

Please bear in mind there is a counterpart to this document, designed to be a review/description of all the exports in ServiceDesk.  It may be found via a dedicated button in ServiceDesk’s Export Miscellaneous Data form (Shift-F3), or via this link.

Exports are distinguished from reports in that they simply output selected elements of data for you (typically via an Excel file or similar).  Reports, by contrast, are designed to analyze data, compiling sums and ratios, making comparisons, and so on — to give you digested analytics.

The SalesSummary Report

Aside from a few of its supplementary figures, all data in this report comes directly from the SalesJournal (accessible, in terms of its raw entries, via the SalesRead form, quickkey shortcut is F4).  In other words, to produce this report, ServiceDesk reads directly from applicable records within that file/journal, tabulates the results, and displays them to you.

The main body of the report is divided into four columns.  The first column tabulates the total of Paycode 1 and Paycode 2 SalesEntries, and is intended to display totals for work actually completed during the period (regardless of whether it has been paid).  The second column tabulates the Paycode 1 and Paycode 3 SalesEntries, and is intended to show the total of work paid for in the period (regardless of when it was actually completed).  The third column is simply the total of Paycode 2s, and the fourth of Paycode 3s (these columns are there for review if wanted, but it’s typically the first or second you’ll pay most attention to).
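
For illustration only, here is a minimal sketch of how the four column tallies relate to the paycodes (the record structure and field names are assumptions, not ServiceDesk’s actual internals):

```python
# Illustrative sketch of the four-column tally by paycode (field names are hypothetical).
def tally_columns(sales_entries):
    columns = {"completed": 0.0, "paid": 0.0, "paycode2": 0.0, "paycode3": 0.0}
    for entry in sales_entries:
        amount = entry["total"]        # hypothetical field for the entry's sale amount
        paycode = entry["paycode"]     # hypothetical field for the entry's paycode
        if paycode in (1, 2):
            columns["completed"] += amount   # column 1: work completed during the period
        if paycode in (1, 3):
            columns["paid"] += amount        # column 2: work paid for during the period
        if paycode == 2:
            columns["paycode2"] += amount    # column 3: Paycode 2s only
        if paycode == 3:
            columns["paycode3"] += amount    # column 4: Paycode 3s only
    return columns
```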

The second section applies a series of adjustments to work out such matters as the true total of money received, changes in A/Rs, and so on.  This adjustment process is needed because the SalesJournal, in and of itself, only reflects money received when the entirety of a sale is paid.  A/Rs, in turn, may be partially paid, but that fact does not show on the face of any SalesJournal entry.  To make the adjustments applicable in this section (and arrive at the figures provided), ServiceDesk augments its reading of entries in the SalesJournal with a reading of entries in the Applications Journal (quickkey shortcut is Alt-F9).

You’ll note there is an option to either display on-screen or print this report.  If you take the option to print, you’ll have a further option to include line-by-line entries concerning each sale that went into the report.  That option is simply not present if you elect to display on-screen.

There are also several Export options associated with this report—accessible via buttons that appear when the report is displayed.

One matter of occasional confusion concerns the section in the report where there is a distinction between “ServiceCalls” and “Tickets.”  Basically, each entry in the SalesJournal represents a “ticket,” so far as any applicable column of display is concerned.  The intent is to classify the entry also as a “service call” if it is an entry reflecting in-field service (as opposed to POS activity), and if it’s the first (typically only) such entry applicable on a given job.  In other words, we want to exclude (from defined “service calls”) entries that involve going back (for recall or continuation work) after the initial work was supposedly completed.

The actual method used in an effort to achieve the above outcome is as follows:

  1. ‍If the S.Call amount column in the SalesEntry is unequal to zero, the entry is tallied as a ServiceCall.
  2. ‍If the Name column is in the form XX-Xxxxx (as in, say, WP-Smith), and if the Labor amount is at least $60, the system tallies the entry as a ServiceCall.  The logic here is that for warranty work it’s common to leave the S.Call field blank, and to put all labor in the Labor field.  Thus, if from the Name field it appears to be a likely warranty client, and from the amount in the Labor field it appears it was likely the entry on a job that reflected charging for the totality of the repair, the entry is tallied as an S.Call.
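
Expressed as a rough sketch, the two rules amount to something like the following (the field names and the name-pattern test are illustrative assumptions, not the literal implementation):

```python
import re

# Illustrative sketch of the ticket-vs-ServiceCall test described above.
LIKELY_WARRANTY_NAME = re.compile(r"^[A-Za-z]{2}-\w+")   # names of the form XX-Xxxxx, e.g. "WP-Smith"

def is_service_call(entry):
    # Rule 1: any non-zero amount in the S.Call column means a ServiceCall.
    if entry["scall_amount"] != 0:                        # hypothetical field name
        return True
    # Rule 2: a likely warranty client with at least $60 in the Labor column is
    # assumed to be the entry that charged for the totality of the repair.
    if LIKELY_WARRANTY_NAME.match(entry["name"]) and entry["labor_amount"] >= 60:
        return True
    return False
```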

The CommissionsEarned Report

This report is provided for those who pay their technicians on a commission basis.  Like the SalesSummary, it finds its data directly in your SalesJournal.  It applies whatever commission basis you have established in the EarningsRates form (quickkey is Alt-F2), as applicable to the tech on whom you are creating the report.

The screen-displayed version of this report contains summary data only, and in two columns.

Like the first two columns in the SalesSummary, these reflect: first, figures pertaining to work actually performed during the period (i.e., the total of applicable Paycode 1 and 2 entries); and second, figures for jobs paid for during the period (the total of Paycode 1 and 3 entries).  You’ll want to pay the tech on whichever column reflects your payment policy.

Also (and still much like the SalesSummary), if you elect to print the report, you’ll have the option to include the line-by-line entries that went into producing the summary figures.  Usually, this is useful for allowing techs to verify they are indeed being paid on each of their jobs.

The WagesEarned Report

This report reads its data, simply, from the TimeLog.XX file applicable to the employee on whom the report is being created.  Such a file is created for any employee who uses the ServiceDesk ClockIn and ClockOut functions, with the “XX” extension on the filename being the two-letter abbreviation applicable to the employee in question.

Like the Commissions Report, it applies whatever wage rate is established, for the employee, in the EarningsRates form (quickkey is Alt-F2).  It does not calculate withholdings (raw earnings only)—meaning it’s up to you (or a payroll service) to independently do the latter.
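
As a simple illustration of the raw-earnings arithmetic (the TimeLog record layout and the example rate are assumptions):

```python
# Illustrative sketch of the raw-earnings tally; no withholdings are computed.
def raw_earnings(timelog_entries, hourly_rate):
    """timelog_entries: (clock_in, clock_out) pairs for the period, as hours of day."""
    total_hours = sum(clock_out - clock_in for clock_in, clock_out in timelog_entries)
    return total_hours * hourly_rate

# Example: three shifts totaling 24.5 hours, at an assumed $18.00/hour rate.
gross = raw_earnings([(8.0, 16.5), (8.0, 16.0), (9.0, 17.0)], 18.00)   # -> 441.0
```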

The AccountsReceivable Report

This report is available in two forms.  One tabulates all your A/Rs as a group:

The other provides individual breakdowns for each HighVolumeClient.

Regardless of which type of A/R Report you choose, the underlying machinery reads its data directly from your A/R file — the same data that may be reviewed, on a record-by-record basis, in the A/R-Read form (quickkey is F3).

The Profitability Report

This report lists each job completed within a specified time frame, and for each shows three simple figures (revenue, job cost, and resulting margin).  The cost figure is based on a combination of parts used and labor inputs (figured via user-provided inputs for per-trip and hourly technician costs).

The underlying mechanics are as follows:

  1. ‍The system begins by looking, one-by-one, at each SalesJournal entry (Paycode 1 or 2) that fits within the user-specified date range.  It tallies revenue amounts on this basis.
  2. ‍For each such entry, it seeks to find a matching JobRecord.
  3. ‍If from the JobRecord it appears the sale involved a POS situation (system looks in the job’s historical narrative for the phrase "(POS context)"), the transaction is excluded from the main tally figures (in such a case there will be a note at the report bottom that tallies POS items separately).
  4. ‍It tallies quantity of trips, on each job, by reading in the narrative job history.
  5. It tallies time spent, on each job, by reading in the narrative job history.
  6. ‍It tallies cost of parts used by searching for job-matching entries in the PartsProcess file, archived-PartsProcess file and InventoryJournal (quickkey entries for direct review of these contexts are, respectively, F8, Ctrl-F8 and F10>Review-Purchases-and-Usage).
  7. It tallies LaborCost on the basis of user-provided trip-cost multiplied by trip quantity, then adds user-provided hourly-cost multiplied by time spent.
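
To make the arithmetic concrete, here is a minimal sketch of how the per-job cost and margin figures come together (revenue is taken from step 1; all names are illustrative, and the trip, hour, and parts inputs are assumed to have been gathered per steps 4 through 6):

```python
# Illustrative per-job margin calculation, following steps 4-7 above.
def job_margin(revenue, trips, hours_on_job, parts_cost, per_trip_cost, per_hour_cost):
    labor_cost = trips * per_trip_cost + hours_on_job * per_hour_cost
    job_cost = parts_cost + labor_cost
    return revenue - job_cost

# Example: a $250 ticket, 2 trips, 1.5 hours on site, $45 in parts,
# with assumed costs of $40 per trip and $30 per hour -> margin of $80.
margin = job_margin(250.00, 2, 1.5, 45.00, 40.00, 30.00)
```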

Please note that the underlying file, as simultaneously created when this report compiles for you, has added data breakdowns (separating parts cost factors from labor cost factors, for example).  For such added detail, simply click on the button to open that file.

The Quality of Service Report (Performance Analysis – Clients)

This report produces a series of figures (amounts involved in sales, averages per job, recall rates, etc.) that help you assess the level of work being done for each of your HighVolumeClients, and to compare these parameters between such parties and against your non-HighVolumeClient work, both at a group level and individually.

The underlying mechanics, as involved in producing the report, are as follows:

  1. ‍The system begins by looking, one-by-one, at each SalesJournal entry (Paycode 1 or 2) that fits within your specified date range.  It tallies sale amounts on this basis.
  2. ‍For each such entry, it seeks to find a matching JobRecord.
  3. If from the JobRecord it appears the sale involved a POS situation (system looks in the job’s historical narrative for the phrase "(POS context)"), the transaction is excluded from the main tally figures (in such a case there will be a note at the report bottom that tallies POS items separately).
  4. ‍It determines whether to tally each particular job as a recall by using the “key-word” method (i.e., it looks for “RECALL”, “RE-CALL”, “CALLBACK” or “C/B” in the job’s Description/Complaint box).
  5. ‍It tallies quantity of trips, on each job, by reading in the narrative job history.
  6. ‍It tallies time spent, on each job, by reading in the narrative job history.
  7. ‍It tallies quantity of days from start to finish, on each job, by counting the days between the job’s OriginDate and the date of last technician visit.
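
As an illustration of the recall test in step 4, the keyword check might look roughly like this (the keyword list is as described above; the field name is an assumption):

```python
# Illustrative sketch of the keyword-based recall test (step 4 above).
RECALL_KEYWORDS = ("RECALL", "RE-CALL", "CALLBACK", "C/B")

def is_recall(job):
    complaint = job["description_complaint"].upper()   # hypothetical field name
    return any(keyword in complaint for keyword in RECALL_KEYWORDS)
```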

The Margin Analysis Report (Performance Analysis – Clients)

This report produces a series of figures (quantity of trips, quantity of time, etc., as compared to revenue) designed to help you assess profitability of work as connected to each of your HighVolumeClients, comparing between such parties, and to your non-HighVolumeClient work (again, both at a group level and individually).

The underlying mechanics are as follows:

  1. ‍The system begins by looking, one-by-one, at each SalesJournal entry (Paycode 1 or 2) that fits within the user-specified date range.  It tallies sale amounts on this basis.
  2. ‍For each such entry, it seeks to find a matching JobRecord.
  3. ‍If from the JobRecord it appears the sale involved a POS situation (system looks in the job’s historical narrative for the phrase "(POS context)"), the transaction is excluded from the main tally figures (in such a case there will be a note at the report bottom that tallies POS items separately).
  4. ‍It tallies quantity of trips, on each job, by reading in the narrative job history.
  5. It tallies time spent, on each job, by reading in the narrative job history.
  6. ‍It tallies cost of parts used by searching for job-matching entries in the PartsProcess file, archived-PartsProcess file and InventoryJournal (quickkey entries for direct review of these contexts are, respectively, F8, Ctrl-F8 and F10>Review-Purchases-and-Usage).
  7. ‍It tallies LaborCost on the basis of user-provided trip-cost multiplied by quantity, then adds user-provided hourly-cost multiplied by time spent.

Please note that, for margin figures to be accurate, the user-query-provided trip-cost and hourly-cost figures must, in turn, be accurate.  This raises the question of how you arrive at such figures.  Our suggestion: run the report once using whatever seat-of-the-pants guesses you wish for these figures.  Once run, the report will provide total trips for the period and total on-the-job hours for the period.  Go to your financial accounting and find your total expenses for the period.  Figure half of that total as trip cost, and divide by the quantity of trips to get per-trip cost.  Figure the other half as hourly/time cost, and divide by total hours to get hourly cost.  Then run the report again with these figures.
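
Expressed as a small worked example (the dollar, trip, and hour figures are purely illustrative):

```python
# Illustrative worked example of the suggested cost-basis method.
total_expenses = 120_000.00   # total operating expenses for the period, from your accounting
total_trips = 1_500           # total trips for the period, taken from the first report run
total_hours = 2_400           # total on-the-job hours for the period, from the first run

per_trip_cost = (total_expenses / 2) / total_trips   # 60,000 / 1,500 = $40.00 per trip
per_hour_cost = (total_expenses / 2) / total_hours   # 60,000 / 2,400 = $25.00 per hour
```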

As another note, please observe that even if your provided cost-basis figures are not accurate, you’ll likely still have valid comparisons between one HighVolumeClient and another, and between HighVolumeClients and non-HVC work.

The Result on Dispatches Report (Performance Analysis – Techs)

This is one of our newer Technician Productivity reports (introduced January 2012).  Its purpose is to provide some fairly raw numbers showing just what each tech has done with the dispatches given him (i.e., how many resulted in completions, how many in part orders, etc.).

The report’s output loads into Excel, and on that basis takes advantage of greater width availability than can display well in this manual.  Nevertheless, here’s a shrunken image to give you some idea of what’s involved:

The report’s methodology is as follows:

  1. The system reads within your Archived-ScheduleList, finding all appointments that fit within the requested date range.
  2. It iterates through that set of appointments, once for your operation as a whole, then for each tech in your current roster, then for any appointments that have no tech assignment, or whose assignment is to other than a current-roster tech.
  3. As it works with each particular appointment, it uses the appointment’s CheckOff symbol to deduce whether the appointment should be deemed as the job having been “Completed” (Heart symbol), as a customer “No-Show” (Diamond symbol), or otherwise.
  4. If the CheckOff symbol is otherwise (i.e., not a Heart or Diamond symbol), the system examines the narrative history within the applicable JobRecord (assuming the same can be found, which in virtually all cases should be true) to see whether parts were ordered; if so, the appointment is scored in the “Parts Ordered” category.  If in this mode (i.e., the appointment did not have either a Heart or Diamond symbol) and there is no evidence in the narrative history that parts were ordered, the appointment/dispatch is scored within the “No disposition” category.
  5. The system iterates through, as per the above, three different times.  First it does so for all appointments (blue section in the above-illustrated output).  Then it does so for appointments that, based on the system looking in each item’s relevant JobHistory, it is able to deduce had a prior-fulfilled appointment (pink section in the above-illustrated output).  Finally, it does so for those appointments where there was no basis, upon reading the applicable JobHistory, to conclude there had been a prior appointment (green section in the above-illustrated output).
  6. ‍Based on the quantities and percents as tallied via each of the above-described iterations, the system enters resulting data into an Excel spreadsheet in the pattern as shown above.
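
In rough outline, the per-appointment classification in steps 3 and 4 amounts to something like the following (the symbol constants, field names, and parts-order test are illustrative stand-ins):

```python
# Illustrative sketch of the appointment classification in steps 3-4 above.
def classify_appointment(checkoff_symbol, job_history):
    if checkoff_symbol == "HEART":        # stand-in for the Heart symbol
        return "Completed"
    if checkoff_symbol == "DIAMOND":      # stand-in for the Diamond symbol
        return "No-Show"
    # Otherwise, look to the job's narrative history for evidence of a parts order.
    if job_history is not None and "ORDERED" in job_history.upper():   # assumed evidence test
        return "Parts Ordered"
    return "No disposition"
```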

The Percent of Completion Report (Performance Analysis – Techs)

This report produces a set of figures that help you assess, on a comparison basis between technicians, how well each is doing in terms of completing on the first visit, versus second, versus third, versus requiring four or more trips for completion.  It also allows comparison of averages, total quantity of jobs completed, etc.

Please note how the leftward graph allows you to visually compare (and at a glance) how your techs are comparing in regard to needing more than one trip, or not (you can easily see via comparative yellow/red/blue bands, for example, that BB is comparing poorly).  The two rightward graphs (cyan and violet) similarly allow at-a-glance comparison of how the techs compare on average trips-per-job and average days start-to-completion.

In the case of all graphs, it’s also easy to compare with company-wide averages and numbers, as shown in the top/red section.  Indeed, you can think of the company-wide values as providing a “par” figure, against which each tech can be compared.

Methodology for this report is as follows:

  1. ‍The system reads within your archived JobRecords, beginning at the most recent, and working toward the oldest.
  2. It continues reading in such succession until having either: (a) reached record position 1; or (b) encountered a weighted-basis quantity of records that are older than your specified date range.
  3. ‍The determination of whether a job fits within the specified date range is based on its OriginDate.
  4. ‍For each job that’s found to fit within the date range, the system tallies quantity of visits by reading in the narrative history.
  5. For the main section as applicable to each tech, it determines which tech the job should be credited to by looking, in the narrative history, to see which tech was there first.  The theory is that a different tech might be called upon to finish a job that a less competent one failed to (but should have) finished with fewer trips.  It’s not the tech who finished, but the one who should have finished earlier, that should be charged with the multiple trips.
  6. For the little final-line section as applicable to each tech (i.e., showing Total Completes and Avg Completes/Business Day), it concentrates instead on the last tech who performed on the job.  It would not seem sensible, simply, to credit a mere initiating tech with job completions.
  7. For the last figure in that final-line section (i.e., showing Avg Days from Start To Finish), the system tallies only those jobs where one and the same tech was there for both the first and final visit.  The thinking here is that, if different techs were involved, it would not make sense to attribute such a figure to either one of them.
  8. Please also note that, in regard to that final figure (showing Avg Days from Start To Finish), we are counting days between the first visit and the final visit — not between the date the job was written and the final visit.  Since this is a measure of technician performance, there is no reason to include time between when the customer requested service and the date of the first visit.
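
A condensed sketch of the attribution rules in steps 5 through 8 might look like this (the visit records, assumed to have been parsed from the narrative history, are hypothetical in structure):

```python
# Illustrative sketch of the attribution rules in steps 5-8 above.
# "visits" is a chronologically ordered list of tech visits parsed from the job history;
# each visit holds the tech's ID and the visit date (a datetime.date value).
def attribute_job(visits):
    first_tech = visits[0]["tech"]
    last_tech = visits[-1]["tech"]
    result = {
        "trips_charged_to": first_tech,        # step 5: multiple trips charged to the first tech
        "completion_credited_to": last_tech,   # step 6: the completion credited to the finishing tech
        "days_start_to_finish": None,
    }
    # Steps 7-8: days from first visit to final visit, counted only when one tech did both.
    if first_tech == last_tech:
        result["days_start_to_finish"] = (visits[-1]["date"] - visits[0]["date"]).days
    return result
```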

In respect of Average-Completes-Per-Business-Day, please note the first figure is derived on the basis of a five-day work week.  In other words, the system looks at the date range involved, and figures how many standard week days (Monday through Friday) fit within that period.  That count of standard “business” days is then used as the denominator in calculating completes (for the technician who did the complete) per standard business day.

In regard to Average-Completes-Per-Day-Worked, by contrast, the system’s method is to count, as a day worked by any particular tech, any day on which there’s an entry in an applicable JobRecord showing that he completed a work visit on a job that day.  No other days are counted.  For the operation as a whole (i.e., “ALL” in the tally), any day on which any tech worked (according to the above-described criteria) is counted as a work day.
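
A minimal sketch of the two denominators might look like this (function names are illustrative; the day-worked test follows the criteria described above):

```python
from datetime import date, timedelta

# Illustrative sketch of the two per-day denominators described above.
def business_days(start: date, end: date) -> int:
    """Count Monday-through-Friday days in the range, inclusive."""
    count, day = 0, start
    while day <= end:
        if day.weekday() < 5:       # Monday=0 ... Friday=4
            count += 1
        day += timedelta(days=1)
    return count

def days_worked(completed_visit_dates) -> int:
    """Count distinct days on which the tech completed a work visit."""
    return len(set(completed_visit_dates))
```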

Based on the above, you may note some interesting comparisons.  Some of your techs may show higher completes per business day than per day worked.  For those, it’s evident they must have worked more days than are involved in the measure of standard business days.  Other techs may show higher completes per day worked than per business day.  For those, it should be evident that, for the period in question, they worked fewer days than are involved in the measure of standard business days.  Another interesting factor: you’ll likely find less variation, among the techs, in completes per day worked, as opposed to completes per business day.

The Techs Time On Job Report (Performance Analysis – Techs)

This report produces a set of figures that help you assess, on a comparison basis between technicians, how well each is doing in terms of arriving at and departing from jobs within (and preferably toward the front portion of) scheduled time frames.  It further allows a comparison of total time spent on jobs.

Please note the graphic provided in each section.  The red rectangle is intended to denote average window size (i.e., of the appointment window for which the tech is scheduled).  The green box denotes the average amount of time spent by the tech, per job, and where it fits, time-wise, within the larger appointment window.

You can tell at a glance, for example, that CP (above) is doing badly, in terms of where he’s positioning his on-site times compared to appointment windows.  BB, on the other hand, is doing much better, and he’s very quick (short time on each job) as well.  EB is doing the best, in terms of having his on-site time toward the front of appointment windows.  On the other hand, he is being given much larger appointment windows (as compared to the other guys) to work with.

Please also again note that the top/red section shows figures and graphs for the operation as a whole.  In part, this provides a “par” standard against which individuals may be assessed, but it’s also a useful measure on the company as a whole.  In this particular case, indeed, a quick glance at the company-wide/top-section graphic shows that, overall, tech on-site times are quite late as compared to appointment windows.  Knowing how much customers appreciate having techs on-site early within their appointment windows (and hate it otherwise), this is something that, as an owner/manager, I’d want to strongly address.

Methodology for this report is as follows:

  1. ‍The system reads within your archived JobRecords, beginning at the most recent, and working toward the oldest.
  2. It continues reading in such succession until having either: (a) reached record position 1; or (b) encountered a weighted-basis quantity of records that are older than your specified date range.
  3. ‍The determination of whether a job fits within the specified date range is based on its OriginDate.
  4. For each job that’s found to fit within the date range, the system looks in the narrative history to find entries that: (a) describe a tech’s visit; and (b) include his start and end times.  (Please note, if there are “XX rsppndd”-type entries that do not include start and end times, these will not be included in this report.)
  5. For each such entry found, the system looks to a preceding entry that describes the time-frame scheduled.
  6. ‍Upon finding any such appropriate pairing (i.e., one entry describing the time-frame scheduled, and another describing the times when the tech was actually there), the system then tallies appropriate comparisons, and compiles for presentation in the report.
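
As a sketch of the comparison made for each such pairing (times are expressed here as simple hours-of-day; the real entries are parsed from the narrative history):

```python
# Illustrative sketch of one schedule/visit comparison (step 6 above).
# Times are hours-of-day for simplicity (e.g., 13.5 = 1:30 pm).
def compare_to_window(window_start, window_end, arrive, depart):
    return {
        "window_size": window_end - window_start,      # length of the scheduled window
        "time_on_job": depart - arrive,                # actual on-site time
        "offset_into_window": arrive - window_start,   # how far into the window the tech arrived
    }

# Example: a 12:00-4:00 window, with the tech on site from 2:30 to 3:45.
comparison = compare_to_window(12.0, 16.0, 14.5, 15.75)
# -> window_size 4.0, time_on_job 1.25, offset_into_window 2.5
```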

The Techs Revenue Report (Performance Analysis – Techs)

This report produces a set of figures to help you determine, on a comparison basis between technicians, how well each is doing in terms of producing revenue.

There is, again, a top/red section for company-wide values, and a selection of graphics to help with at-a-glance comparisons.

In particular, the top section provides company-wide (or “par”) geometries for each measure, with each tech’s particular measure purposely arranged to allow easy direct comparison.  At a glance, for example (green graphs), you can see that three of the techs (RA especially) are performing well above par in regard to total sales.  Their average totals-per-work-day (yellow graphs) mirror the same fact.  However, one of the techs who’s strong in total sales is not so strong in average total-per-ticket (AV, blue graph).

The leftward purple/violet graph is particularly interesting in its ratio-type comparison between labor and materials sold.  Glancing at this graph in AV’s section may give an immediate clue as to why his average total-per-ticket is below par.  It appears, simply, that in comparison to others he is underselling on parts.  Perhaps that is all he needs to amend.

Methodology for this report is as follows:

  1. ‍The system reads in your SalesJournal file, and determines the range of entries that fits within your date-range specification.
  2. ‍For each entry within that range, it uses the Technician field to determine the tech to whom the sale should be attributed, and tallies accordingly.
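
In outline, the tally is a simple grouping by the Technician field, roughly as in this sketch (field names are hypothetical):

```python
from collections import defaultdict

# Illustrative sketch of the per-tech revenue tally described above.
def tally_revenue_by_tech(entries_in_range):
    totals = defaultdict(float)
    for entry in entries_in_range:
        tech = entry["technician"]       # hypothetical field for the Technician column
        totals[tech] += entry["total"]   # hypothetical field for the sale amount
        totals["ALL"] += entry["total"]  # company-wide total
    return dict(totals)
```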