
IT Project Manager


What is an IT project manager?

An IT project manager is a professional charged with planning, executing and delegating responsibilities for projects that support an organization's information technology (IT) goals.

IT project managers may work in a variety of industries, as nearly all organizations rely on computing technologies. Some organizations establish IT project management offices (PMOs) to guide the completion of large-scale initiatives.

What does an IT project manager do?

An IT project manager supports the business directives behind specific initiatives and assigns resources based on those goals. Because project participants are often spread across different offices and teams, the IT project manager's role is to ensure that projects are delivered smoothly, on time and on budget, with minimal interruption to work.

The types of projects an IT project manager may be responsible for include the following:

  • software development
  • mobile application development
  • web development
  • database management
  • backup and recovery
  • cloud migration
  • software implementation
  • hardware installation
  • network configuration
  • infrastructure management

IT project manager tasks may include the following:

  • project planning, including setting goals, milestones and completion plans;
  • maintaining schedules and budgets for each project (a simple tracking sketch follows this list);
  • managing team members;
  • distributing tasks to project team members;
  • presenting project plans;
  • tracking the progress and performance of team members;
  • assessing risks and taking appropriate action to mitigate them; and
  • leading meetings between teams and other stakeholders.
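
Purely as an illustration of the scheduling, budgeting and milestone-tracking tasks above, here is a minimal sketch in Python. The class and field names are hypothetical and are not drawn from any particular project management tool.

    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class Milestone:
        name: str
        due: date
        done: bool = False

    @dataclass
    class ProjectPlan:
        name: str
        budget: float                  # approved budget
        spent: float = 0.0             # spend to date
        milestones: list[Milestone] = field(default_factory=list)

        def overdue(self, today: date) -> list[Milestone]:
            # Milestones past their due date that are still open.
            return [m for m in self.milestones if not m.done and m.due < today]

        def over_budget(self) -> bool:
            return self.spent > self.budget

    # Example: flag schedule and budget risks ahead of a status meeting.
    plan = ProjectPlan("Cloud migration", budget=50000.0, spent=42000.0)
    plan.milestones.append(Milestone("Pilot workloads migrated", date(2025, 6, 1)))
    print(plan.overdue(date(2025, 7, 1)), plan.over_budget())

Real project management tools track far more than this, but the idea is the same: compare planned dates and spending against actuals and surface anything at risk.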

How do you become an IT project manager?

IT project manager positions typically require a mix of technical and soft skills. While a strong technical background is necessary, job descriptions also ask for non-technical skills such as managing tasks and schedules and providing detailed plans.

Education requirements. To be considered for an IT project manager position, candidates should have at least a bachelor's degree in computer science, IT or a related field. While some positions may accept candidates with an associate degree or the equivalent experience, a bachelor's degree is typically preferred.

Advanced degrees in business management or professional project management certifications, such as Project Management Professional (PMP) or the Certified ScrumMaster from Scrum Alliance, may be required or preferred by employers.

Skills. Employers are generally looking for candidates with the following technical and soft skills:

  • an advanced knowledge of computers, computer systems, software and network technology;
  • communication and leadership skills;
  • analytical problem-solving skills;
  • proven project management skills;
  • familiarity with one or more project management methodologies; and
  • organization and time management skills.

IT project management methodologies

IT project managers often use project management methodologies or frameworks to guide practices. Popular project management methodologies used for IT projects include the following:

  • Agile. This framework relies on short delivery cycles. It is often employed for projects where speed and flexibility are prioritized.
  • Waterfall methodology. Work flows sequentially through defined phases. In the Waterfall model, work moves to the next phase only after the previous phase is complete.
  • Scrum. Scrum places a focus on transparency, inspection and adaptation, and encourages iterative progress, accountability and teamwork. Work is broken down into short "sprints" (a brief sprint-planning sketch follows this list).
  • PRINCE2. Involves extensive early-stage planning. This project management framework combines practices from a variety of backgrounds and industries.
  • Traditional project management. Draws principles from the Project Management Body of Knowledge (PMBOK) guide, which describes each project management process in terms of its inputs, tools and techniques, and outputs.
  • Lean. Focuses on reducing unnecessary waste in resources and optimizing processes for efficiency.
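
As a rough illustration of the sprint-based flow described under Scrum above, the sketch below pulls items from a hypothetical product backlog into fixed-length sprints based on the team's capacity. The item names and point values are invented for the example and greatly simplify real sprint planning.

    from collections import deque

    # Hypothetical backlog items with rough effort estimates (story points).
    backlog = deque([
        ("User login", 5),
        ("Password reset", 3),
        ("Audit log", 8),
        ("Admin dashboard", 5),
    ])
    CAPACITY = 10  # points the team expects to complete per sprint

    sprint_no = 0
    while backlog:
        sprint_no += 1
        committed, points = [], 0
        # Pull the highest-priority items that still fit the sprint's capacity.
        while backlog and points + backlog[0][1] <= CAPACITY:
            item, effort = backlog.popleft()
            committed.append(item)
            points += effort
        if not committed:
            # A single item larger than the capacity is taken on its own.
            committed.append(backlog.popleft()[0])
        print(f"Sprint {sprint_no}: {committed}")

Each sprint ends with working output and a chance to inspect and adapt the plan before the next one begins.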

