A.3.1 The Tax Office sets itself performance standards for the delivery of PBRs. It assesses timeliness and quality. It also measures taxpayers' satisfaction with 'professionalism attributes'.
A.3.2 The timeliness performance standards reported in the Commissioner of Taxation's annual reports concern all applications for 'private written binding advice' made to the Tax Office. This includes applications from individuals and small businesses, as well as other classes of binding advice that are not considered PBRs for the purposes of the Taxation Administration Act 1953.
A.3.3 These reported timeframes are calculated from the time at which the Tax Office considers it has received all the necessary information. The benchmark is either a 28-day period or an extended period negotiated with the PBR applicant. In most, if not almost all, large business PBR applications the Tax Office is likely to have negotiated longer timeframes, because the issues considered in these applications are generally more complex and require more time and resources to consider and resolve. The figures in the annual reports aggregate all private binding advice requests, not just those from large businesses. Large business requests comprise an extremely small percentage of all private binding ruling requests.
A.3.4 The Tax Office's internal reporting further breaks up the performance of private written binding advice applications into three categories: 'routine' cases, subject to the 28-day period; 'complex' cases, subject to a negotiated completion date; and cases finalised within 90 days of the Tax Office receiving all information.
A.3.5 The figures provided in the raw data rows of the table below are calculated on the basis of elapsed time from the date the Tax Office received the application until the date it was finalised. The figures may include periods in which the Tax Office was waiting for further material from the applicant. These figures relate only to PBRs and do not include other classes of private written binding advice.
| Performance measure | Standard | 2003-04 (%) | 2004-05 (%) |
|---|---|---|---|
| Annual report - all private written binding advice (a) | | | |
| Finalised within 28 days or as otherwise agreed | Benchmark | 80 | 83 |
| Large business - internal report on private written binding advice (b) | | | |
| Routine cases (finalised within 28 days) | Benchmark | | 83 |
| Complex cases (finalised as agreed) | Benchmark | | 83 |
| Finalised within 90 days | Benchmark | | 100 |
| Large business - raw data on PBRs (c) | | | |
| Finalised within time period | 28 days | 14.2 | 21.1 |
(a) The Commissioner of Taxation Annual Reports 2003-04 and 2004-05, Table 1.1
(b) Tax Office, LB&I Provision of Advice internal reports, Section 2.1, June 05 YTD.
(c) Spreadsheet provided by the Tax Office to the Inspector-General listing all large business PBR applications received or considered during 2003-04 and 2004-05.
A.3.6 The following two tables are sourced from the raw data provided by the Tax Office. They break down the number of large business PBR applications received and considered during the 2003-04 and 2004-05 years, and the average periods taken to finalise them, by outcome and by whether the Tax Office needed to create a precedential view.
| PBR outcome | Type of Tax Office view needed | | Total |
|---|---|---|---|
| Refusal to rule | 52 | 4 | 56 |
| Favourable to applicant | 192 | 60 | 252 |
| Unfavourable to applicant | 28 | 20 | 48 |
Source: Tax Office
| PBR outcome | Type of Tax Office view needed | | Total average |
|---|---|---|---|
| Refusal to rule | 221 | 180 | 218 |
Source: Tax Office
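As a cross-check (not part of the Tax Office data), the 'Total average' reported for the 'Refusal to rule' outcome is consistent with a case-weighted mean of the two sub-averages, using the case counts shown in the preceding table; the unit of days is assumed here:

```python
# Illustrative check only: combine the two sub-averages for the
# 'Refusal to rule' outcome, weighted by the case counts from the
# counts table (52 and 4 cases).
counts = [52, 4]        # cases, by type of Tax Office view needed
averages = [221, 180]   # average periods to finalise (days assumed)

weighted = sum(c * a for c, a in zip(counts, averages)) / sum(counts)
print(round(weighted))  # 218, matching the reported total average
```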
A.3.7 The Tax Office assesses the quality of its PBRs against performance standards through its technical quality review (TQR) process. Business lines within the Tax Office randomly select finalised cases involving decisions of a technical nature (interpretive decisions) to be reviewed under this process. Cases may include PBRs, other classes of administratively binding advice, audit decisions, penalty decisions and settlement cases. The TQR process is conducted on a regular basis by a panel of tax officials (senior officers experienced in technical decision making) and a 'community representative' (generally a person with experience in tax matters at a large-firm partner level, or a person occupying a senior position in an academic institution and specialising in tax administration) who review the files relating to the decisions. They assess those decisions in accordance with the Tax Office's 'judgment model' and assess compliance with the relevant Tax Office practice statements. Cases are graded according to the quality of the decisions.
A.3.8 The TQR panel reviewed a range of LB&I written interpretive decisions that were finalised during 2003-04 and 2004-05. The benchmarks were 85 per cent of decisions receiving an 'A' grading and 95 per cent receiving a 'Pass' grading.
| Period in which decision made | Quality of decision: 'Pass' (%) | 'A' (%) | 'B' and 'C' (%) | 'D' and 'E' (%) | Compliance with practice statements (%) |
|---|---|---|---|---|---|
| Sept 2003 - Feb 2004 | 98 | 91 | 8 | 0 | 100 |
| Mar - Aug 2004 | 95 | 85 | 10 | 5 | 96 |
| Sept 2004 - Jan 2005 | 100 | 96 | 4 | 0 | 100 |
Source: Tax Office
A.3.9 As a result of this process, the TQR panel highlighted file management and communication with taxpayers to the LB&I area as areas for improvement.
A.3.10 The Tax Office has also recently sought feedback from PBR rulees through its Client Feedback Questionnaires (CFQs). During 2006–07, it received 42 responses (131 CFQs were issued) relating to written binding advice, with an average 'satisfaction rating' of 80 per cent (an overall average of answers to questions rating the Tax Office as 'very high' or 'high' with respect to proposed professionalism attributes). Certain PBR applicants were not asked to complete a questionnaire. These included applicants who withdrew their application, cases where the Tax Office refused to rule or invalidated the request, and cases excluded by team leaders or quality assurance officers. The report for these results noted areas that attracted positive and negative comments and commented that:
The satisfaction rating for written binding advice products for 2006/07 was 80 per cent which is a slight increase compared to the 2005/06 result of 79 per cent. Seventy nine percent of advice cases achieved a satisfaction rating of 70 per cent or higher which is above the benchmark standard and within these results sixteen cases achieved a 100 per cent satisfaction rating. Overall, questions rated quite high, with the question relating to timeliness in finalising the advice showing an improvement compared to the previous year results.
CFQ results for advice products related to thirty six cases where the outcome was favourable and six cases related to an unfavourable or only partially favourable outcome. In relation to the unfavourable outcomes, it is important to note that three of these cases achieved the benchmark standard, with one achieving a satisfaction rating of one hundred percent.
On the whole, regular and open communication was a significant factor in achieving superior CFQ results and maintaining positive client relationships. The majority of client comments for advice products were very positive, particularly in relation to the accessibility and responsiveness of officers and the excellent service delivery. Specifically, the priority ruling system was highlighted as a very positive experience and the efficiency, professionalism and expertise of the teams was commended. The benefits of utilising Centres of Expertise was also noted, however there were comments made by both clients and case officers regarding internal process delays in this area.