Alexander Podelko
Consulting Member of Technical Staff - Oracle
Stamford, United States

Alex Podelko has specialized in performance since 1997, working as a performance engineer and architect for several companies. Currently, he is a Consulting Member of Technical Staff at Oracle, responsible for performance testing and optimization of Enterprise Performance Management and Business Intelligence (a.k.a. Hyperion) products.

Alex periodically talks and writes about performance-related topics, advocating tearing down silo walls between different groups of performance professionals. His collection of performance-related links and documents (including his recent articles and presentations) can be found at http://www.alexanderpodelko.com. He blogs at http://alexanderpodelko.com/blog and can be found on Twitter as @apodelko. Alex currently serves as a director for the Computer Measurement Group (CMG, http://cmg.org), an organization of performance and capacity planning professionals.

Attended conferences (1)
Talks (2)
  • 11.02.2018
    Performance Requirements: the Backbone of the Performance Engineering Process

    Performance requirements should be tracked from a system's inception through its whole lifecycle, including design, development, testing, operations, and maintenance. They are the backbone of the performance engineering process. However, different groups of people are involved at each stage, and each uses its own vision, terminology, metrics, and tools, which makes the subject confusing once you get into the details. The presentation discusses existing issues and approaches and their relationship to the performance engineering process.

    • Average
    • 40 min
    • SQA Days / 23
  • 11.02.2018
    Continuous Performance Testing: Myths and Realities

    While the development process is moving towards all things continuous, performance testing remains rather a gray area. Some continue to do it in the traditional pre-release fashion, while others claim 100% automation and full integration into their continuous process. We have a full spectrum of opinions on what should be done, when, and how in regard to performance. The issue is that context is usually not clearly specified, even though context is the main factor here. Depending on the context, the approach may (and probably should) be completely different. Full success in an environment that is simple from the performance testing point of view doesn't mean that you can easily replicate it in a difficult one. The speaker will discuss the issues of making performance testing continuous in detail, illustrating them with personal experience where possible.

    • Average
    • 40 min
    • SQA Days / 23