How it Works
How does Triometric Analyzer work?
Triometric Analyzer monitors real end users of web-based applications, traffic to web sites and transactions to web services systems. It analyses usage, errors and performance.
The technology typically comprises two servers: one for the Monitor and one for a combined Manager and Reporter.
The Manager provides enterprise class data storage, sophisticated data processing and web based reporting.
The Monitor uses passive packet capture to view HTTP, XML and SSL packets and dynamically decrypts SSL-encrypted packets.
This advanced protocol analysis allows an in-depth reading of information in the IP, TCP, SSL, HTTP and XML protocols.
Each packet is analysed in real-time and tracked with reference to its transaction.
When a transaction completes, the monitor writes a log record. Periodically, the Monitor connects to the Manager and transfers the most recent set of statistics.
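The per-packet tracking described above can be sketched in a few lines of Python. Everything here (the packet field names, the record layout) is an assumption for illustration; a real monitor reads raw packets off the wire and decodes the full protocol stack:

```python
# A minimal sketch of passive transaction tracking, assuming simplified
# packet records. Illustrative only; not Triometric's implementation.

class TransactionTracker:
    def __init__(self):
        self.open = {}   # flow key -> request timestamp
        self.log = []    # completed transaction log records

    @staticmethod
    def flow_key(pkt):
        # Identify a TCP flow by its address/port pairs,
        # independent of packet direction.
        a = (pkt["src"], pkt["sport"])
        b = (pkt["dst"], pkt["dport"])
        return tuple(sorted((a, b)))

    def on_packet(self, pkt):
        key = self.flow_key(pkt)
        if pkt["kind"] == "request":
            self.open[key] = pkt["ts"]
        elif pkt["kind"] == "response_end" and key in self.open:
            # Transaction complete: write a log record for this flow.
            self.log.append({
                "url": pkt.get("url"),
                "response_time": pkt["ts"] - self.open.pop(key),
            })

tracker = TransactionTracker()
tracker.on_packet({"src": "10.0.0.1", "sport": 5000, "dst": "10.0.0.2",
                   "dport": 80, "kind": "request", "ts": 100.0,
                   "url": "/index.html"})
tracker.on_packet({"src": "10.0.0.2", "sport": 80, "dst": "10.0.0.1",
                   "dport": 5000, "kind": "response_end", "ts": 100.25,
                   "url": "/index.html"})
print(tracker.log[0]["response_time"])  # 0.25
```

The key design point is that the tracker only keeps lightweight state per open flow, writing a compact log record when the transaction ends rather than storing the packets themselves.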
What is passive packet capture?
Passive packet capture is the gold standard for real user monitoring.
Triometric pioneered the use of this technique in the web application context and has more experience with it than any other provider. 'Passive' means monitoring is carried out unobtrusively, without adding load to network or servers. 'Packet capture' means the software picks up real network traffic and analyses actual end users or transactions to arrive at its results.
What does Analyzer measure?
Real user transactions are captured 'on the wire'. Core measurements include:
- Traffic volumes: users and visits; pages and objects served
- End user behaviour: application objects, time and volume of use, location
- Response time: end-to-end across the network, and from server to network
- Errors: application errors, protocol errors, Triometric-defined errors and aborted transactions
- Content: bytes and packets sent and received by the servers
The information is analysed and related to the time period of the transaction, the URL requested, location, access method and individual user ID of the end user, and the server, server group and domain involved. In addition to core measurements, specialised versions of Analyzer provide additional in-depth analysis of traffic content. This includes application traffic and XML based transactions.
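The roll-up by time period, URL and server described above can be illustrated with a small aggregation sketch. The record fields and bucket width are assumptions for illustration, not the product's actual schema:

```python
from collections import defaultdict

# Hypothetical sketch: rolling completed-transaction records up by the
# dimensions the text describes (time period, URL, server).

records = [
    {"ts": 3605, "url": "/buy", "server": "web1", "bytes": 2048, "error": False},
    {"ts": 3650, "url": "/buy", "server": "web1", "bytes": 1024, "error": True},
    {"ts": 7300, "url": "/buy", "server": "web2", "bytes": 4096, "error": False},
]

def period(ts, width=3600):
    # Assign each timestamp to an hour-sized reporting bucket.
    return ts // width

stats = defaultdict(lambda: {"hits": 0, "bytes": 0, "errors": 0})
for r in records:
    key = (period(r["ts"]), r["url"], r["server"])
    s = stats[key]
    s["hits"] += 1
    s["bytes"] += r["bytes"]
    s["errors"] += int(r["error"])

print(stats[(1, "/buy", "web1")])  # {'hits': 2, 'bytes': 3072, 'errors': 1}
```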
Where is Analyzer installed?
Analyzer is made up of three software components:
- Monitor performs the packet monitoring, content capture and response time measurement
- Manager collates, processes and stores the information produced by the Monitor
- Reporter generates and presents reports
The Monitor is installed either on the same LAN as the web servers or on a nearby LAN. The Manager and Reporter are installed in any location that allows IP connectivity to the Monitor. Technical and business personnel can view Triometric reports by connecting to the Reporter using a standard web browser or by reading scheduled reports delivered to their inbox. A wide range of reports is included as standard, and more technical users can customise queries and drill down to details of individual real users and transactions.
Large web apps and large websites
In busy enterprise and e-commerce applications, traffic crosses the network at extremely high speed, supported by powerful server farms and very high speed communications links.
One of the most important design principles of Triometric Analyzer is to cater for very high-volume sites and web applications, achieved through:
- Early packet-filtering
- Multi-process software architecture
- Data structures and algorithms to ensure that each packet is analysed as quickly as possible
- Avoiding long-term storage of packets where possible
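The first of these principles, early packet-filtering, can be sketched as a cheap predicate that discards irrelevant traffic before any costly protocol parsing. The port list and packet shape are illustrative assumptions; in practice such filters often run in the kernel as BPF expressions like "tcp and (port 80 or port 443)":

```python
# Illustrative early-filter stage (assumptions, not Triometric code):
# cheap header checks reject packets before any expensive analysis.

MONITORED_PORTS = {80, 443}

def early_filter(pkt):
    """Return True only for TCP packets to/from a monitored port."""
    if pkt.get("proto") != "tcp":
        return False
    return pkt["sport"] in MONITORED_PORTS or pkt["dport"] in MONITORED_PORTS

packets = [
    {"proto": "udp", "sport": 53,   "dport": 3444},   # DNS: dropped
    {"proto": "tcp", "sport": 5000, "dport": 80},     # HTTP: kept
    {"proto": "tcp", "sport": 5001, "dport": 22},     # SSH: dropped
]
kept = [p for p in packets if early_filter(p)]
print(len(kept))  # 1
```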
Standard Triometric Analyzer configurations have been successfully deployed with applications receiving more than 100 million transactions per day. However, for super-large capacity sites and applications, multiple Monitors can be installed with their measurements collated and processed as if from a single Monitor.
Technology comparison
Other methods of measuring web application user behaviour and performance rely on techniques that introduce problems, such as loading production servers or tagging traffic, which real user passive packet capture avoids.
Client agent monitoring
Measurement software is installed into the client system. The software watches transactions, measures response times and periodically reports to a central management system. There are two primary drawbacks: the resources needed for monitoring and reporting to the central system impact other processes on the client; and the need to manage the remote software makes this technique completely unsuitable for a large distributed user base on the public internet.
Citrix server agents
This technique is used by some Citrix monitoring providers. Measurement software is installed into the Citrix server. The drawback is its negative impact on server performance.
Log file analysis
Servers maintain logs of their activity. When analysed, these reveal some of the information passive packet capture yields, but not all: for example, log files can't show how fast the server delivers content to end users, which is one of the most important performance metrics. Log file analysis also tends to be very time consuming, and maintaining the logs takes a great deal of storage.
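As an illustration of what log file analysis can and cannot see, the sketch below parses a line in the widely used Common Log Format (the exact layout depends on server configuration). Note that the fields record what the server did, with no measure of how fast the content reached the end user:

```python
import re

# Sketch of Common Log Format parsing, for illustration only.
CLF = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) (?P<size>\d+)'
)

line = '203.0.113.9 - - [10/Oct/2000:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326'
m = CLF.match(line)
print(m.group("path"), m.group("status"))  # /index.html 200
```

The log yields the request, status and bytes served, but nothing about network delivery time, which packet-level timestamps provide directly.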
Application based logging solutions
Application based logging solutions:
- load the live servers and increase the response time for every request
- collect server-side data only
- divert development, support and maintenance resources away from core revenue activities
- are constrained by change management processes
- aren't readily extensible to monitor additional services
Remote agent monitoring
Usually outsourced, this technique does not monitor real users. Instead it uses a remote appliance to simulate end users, transmitting a sample of scripted requests to the application at regular intervals. Response time is measured, but since the remote appliances tend to be located near internet hubs, research shows the time is not representative of the response time experienced by real end users. It is also unable to discern problems and errors experienced by real users, since the scripted transactions are only an approximation of the actual pattern of end user activity and for reasons of cost can't begin to sample all possible end user actions. Finally, comprehensive monitoring using this technique adds significant load to the server, adversely affecting performance and the experience of real users.
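The scripted-sampling limitation above can be made concrete with a small sketch. The probe function here is a stand-in for a real HTTP fetch, and the paths are illustrative assumptions; the point is that only what the script names ever gets measured:

```python
import time

# Sketch of scripted synthetic probing (the technique critiqued above):
# a fixed set of requests replayed on a schedule. probe() stands in
# for a real HTTP round-trip.

SCRIPT = ["/login", "/search?q=demo", "/logout"]

def probe(path):
    start = time.monotonic()
    time.sleep(0.01)  # stand-in for the real request round-trip
    return path, time.monotonic() - start

samples = [probe(p) for p in SCRIPT]
for path, rt in samples:
    print(f"{path}: {rt:.3f}s")
# Only these scripted paths are measured; real-user behaviour, errors
# and locations outside the script remain invisible to the probe.
```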