METHODOLOGY: How We Ranked the T-Schools

DQI Bureau



Dataquest compiled the seventh Dataquest-CMR survey scorecard of the best T-Schools in the country, based on a methodology and calculations vetted by research firm CMR.


The aim of this survey was to identify the top Technology Schools in the country (those offering BE, B Tech, or similar graduate-level technical courses) and rank them on parameters important to both students and recruiters.



The Dataquest-CMR survey was done in two phases.

Phase 1: Desk Research

In this phase, the CMR team carried out exhaustive desk research to identify a list of 300 T-Schools, along with 40 leading companies that would be invited to take part in the survey. We screened out colleges established after 2007, as well as those not offering BE, B Tech, or similar graduate-level technical courses.

Phase 2: Primary Research


The T-Schools and companies shortlisted in Phase 1 were approached by the CMR team. For the T-Schools, both email responses and face-to-face interviews with a college representative (preferably the placement coordinator) were considered. HR heads of leading companies were contacted over email to capture recruiter perception. The data was compiled from two years of objective data (academic years 2011-12 and 2012-13) provided by the institutes, together with the recruiters' perception scores.


Of the 300 institutes approached, 108 responded to our survey; 76 of these had participated last year, while 32 were new entrants. The CMR research team carried out the validation exercise. The objective scores were obtained by evaluating the T-Schools against the following parameters:

  • Placements
  • Infrastructure
  • Academic Environment
  • Industry Interface

The weights were distributed as follows: placements (40%), infrastructure (15%), academic environment (25%), and industry interface (5%), so the total weightage assigned to objective data was 85%. For institutes that are new entrants this year, this weightage was restricted to 75%.
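For illustration, the split above can be encoded as a small weight table. The key names are ours, not the survey's actual field labels, and the proportional rescale for new entrants is only one plausible reading of the 75% cap, which the article does not spell out:

```python
# Hypothetical encoding of the stated parameter weights (percent);
# the key names are illustrative, not the survey's actual field labels.
OBJECTIVE_WEIGHTS = {
    "placements": 40,
    "infrastructure": 15,
    "academic_environment": 25,
    "industry_interface": 5,
}

# The four weights sum to the 85% share given to objective data.
assert sum(OBJECTIVE_WEIGHTS.values()) == 85

# The article caps new entrants' objective share at 75% but does not say
# how the reduction is applied; a proportional rescale is one plausible
# reading (an assumption on our part).
NEW_ENTRANT_WEIGHTS = {k: v * 75 / 85 for k, v in OBJECTIVE_WEIGHTS.items()}
```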

These parameters were further broken down into sub-parameters, grouped under each of the four heads.

Placements:
  • Percentage of students placed across all types of companies
  • Percentage of students placed in IT companies
  • Number of companies (all types) visiting campus, per student
  • Number of IT companies visiting campus, per student
  • Maximum salary per annum
  • Average salary per annum across all types of companies
  • Average salary per annum in IT companies


Infrastructure:

  • Computer-to-student ratio
  • Percentage of computers connected to the Internet
  • Percentage of students that can be accommodated in hostels
  • Internet access in hostels
  • Percentage of P IV/latest-configuration computers
  • Internet access in computer labs
  • Batch strength of PG courses in engineering disciplines
  • Availability of a digital/networked in-campus library


Academic Environment:

  • Faculty-to-student ratio
  • Percentage of permanent faculty
  • Percentage of permanent faculty with a PhD
  • Number of patents obtained by the institute
  • Batch strength of PhD courses in engineering disciplines
  • Percentage of students who passed with first division in the BE/B Tech degree
  • Average number of research papers in 2011-12 and 2012-13


Industry Interface:

  • Number of MoUs signed with industry
  • Average number of industrial assignments carried out for organizations in 2011-12 and 2012-13
  • Affiliation or linkages with international institutes/education bodies

HR Perception Score: The recruiter perception survey was conducted through a separate questionnaire. Recruiters were asked to rate, on a 5-point scale, the importance of various aspects when deciding which Tech School to visit for campus recruitment; they were also welcome to add parameters of their own. They were then given a list of institutes to rate on a 10-point scale. The overall weightage given to the recruiters' responses was 15%. For institutes receiving an HR score for the first time this year, the weightage of the HR score was restricted to 5%.

The composite score, which is the total of the objective data score and the recruiter's perception score, was used to arrive at the final ranking.


All the absolute data provided by the institutes were normalized and converted to relative data, so that institutes could be compared on each parameter. After normalization, the data were multiplied by the respective weight of each parameter, and the total objective score out of 85 was calculated. Similarly, a total HR perception score out of 15 was calculated, but only for institutes that received marks from HR professionals; otherwise the HR portion was left blank. These two scores were added to give a composite score out of 100. The data was then sorted in descending order of composite score to arrive at the top institutes across India.
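The scoring procedure described above can be sketched in a few lines of Python. This is an illustrative reconstruction, not CMR's actual code: min-max normalization is assumed (the article only says absolute data were "normalized and converted to relative data" without naming the method), and the parameter names are hypothetical labels for the four weighted heads:

```python
# Illustrative reconstruction of the Dataquest-CMR scoring pipeline.
# Assumptions: min-max normalization, and hypothetical parameter keys
# matching the weights stated earlier in the article.

WEIGHTS = {
    "placements": 40.0,
    "infrastructure": 15.0,
    "academic_environment": 25.0,
    "industry_interface": 5.0,
}  # sums to 85, the share assigned to objective data

def min_max(values):
    """Scale raw values to [0, 1]; a constant column maps to all zeros."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0] * len(values)
    return [(v - lo) / (hi - lo) for v in values]

def objective_scores(raw):
    """raw: {institute: {parameter: absolute value}} -> score out of 85."""
    names = list(raw)
    scores = dict.fromkeys(names, 0.0)
    for param, weight in WEIGHTS.items():
        normalized = min_max([raw[n][param] for n in names])
        for name, x in zip(names, normalized):
            scores[name] += x * weight
    return scores

def rank_institutes(raw, hr_scores):
    """Composite = objective (out of 85) + HR perception (out of 15),
    sorted in descending order of composite score."""
    obj = objective_scores(raw)
    composite = {n: obj[n] + hr_scores.get(n, 0.0) for n in obj}
    return sorted(composite.items(), key=lambda kv: kv[1], reverse=True)
```

With two institutes where one leads on every parameter, the leader's normalized values are all 1.0, so it earns the full 85 objective points before its HR score is added; the ranking is simply the composite scores in descending order.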


Every year we see some reshuffling of ranks, mainly because this survey is based on hard facts rather than perception alone; as a result, the total score of a given institute changes from year to year.

Validation Process

The CMR research team carried out the entire data validation exercise. We asked institutes for supporting documents such as the list of companies that visited the campus, the number of students recruited by each and the salaries offered, the placement brochure, the institute prospectus, the organizations for which consulting/industrial assignments were done, the names of journals in which research papers were published, and official copies of semester results. We also validated the data against various secondary sources; in particular, placement data were cross-checked against figures provided by the companies that typically visit campuses for placements.