End-User Monitoring: RUM or Synthetic?
Explore the advantages of Real User Monitoring (RUM) vs. Synthetic Transaction Monitoring

Performance for end users is the metric by which most businesses judge their web applications: is the responsiveness of the application an asset or a liability to the business?  Studies show that users are growing more and more demanding, while average page loads are getting bigger and bigger, more than doubling in weight since 2010.  Combine that with frequent releases and updates from marketing, and pretty soon the optimization job is never quite done.

Ongoing monitoring of application performance from the end user's perspective is therefore critical; fortunately, there are a number of approaches to choose from.  But which one (or ones) is best?

Real User Monitoring

Bad news for end users!

While the server-side performance of a web application can be measured by looking at HTTP requests in your data center, the full page-load experience (downloading static assets from a CDN, rendering the page, executing JavaScript) cannot be seen from that vantage point.  Real user monitoring (RUM) is the practice of using JavaScript embedded in web pages to gather performance data about the end user's browsing experience, from the browser's perspective.

This is a great improvement for a business's understanding of application performance.  The data gathered shows the full timing, based on real pages being loaded, from real browsers, in real locations around the world.  The technique applies equally well to desktop, mobile, and tablet browsers.

The biggest advantage of measuring actual data is that there's no need to pre-define the important use cases. As each user goes through the application, RUM captures everything, so no matter what pages they see, there will be performance data available. This is particularly important for large sites or complex apps, where the functionality or interesting content is constantly changing.

Thanks to advances in browser APIs such as the Navigation Timing API, the detail in RUM data is better than ever.  The API divides time spent in the browser into phases, such as time spent building the DOM and time spent until the document is ready. This is a great starting point, and because the data is captured comprehensively across all users, it's a great point of triage.  If a page was slow, why was it slow? Unfortunately, while RUM provides this starting point, it doesn't necessarily point to the precise asset at fault.  Additionally, the growing trend of "single-page apps" (apps like Gmail or Facebook that do not perform full page loads to fetch new data) does not yield very good RUM data.
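The phase breakdown described above can be sketched in a few lines. This is a minimal illustration, not a RUM agent: the field names follow the W3C Navigation Timing `performance.timing` attributes, but the sample values are invented.

```python
# Slice a Navigation Timing record into phases, the way a RUM agent might.
# Field names follow the W3C Navigation Timing API; sample values are invented.

def timing_phases(t):
    """Break a timing record (epoch-relative milliseconds) into durations."""
    return {
        "dns":      t["domainLookupEnd"] - t["domainLookupStart"],
        "connect":  t["connectEnd"] - t["connectStart"],
        "ttfb":     t["responseStart"] - t["requestStart"],
        "download": t["responseEnd"] - t["responseStart"],
        "dom":      t["domInteractive"] - t["responseEnd"],
        "onload":   t["loadEventEnd"] - t["domInteractive"],
        "total":    t["loadEventEnd"] - t["navigationStart"],
    }

sample = {
    "navigationStart": 0, "domainLookupStart": 1, "domainLookupEnd": 21,
    "connectStart": 21, "connectEnd": 61, "requestStart": 62,
    "responseStart": 262, "responseEnd": 302,
    "domInteractive": 902, "loadEventEnd": 1502,
}
print(timing_phases(sample))  # e.g. ttfb: 200 ms, dom: 600 ms, total: 1502 ms
```

Even this coarse split answers the first triage question: was the time spent on the network, on the server, or in the browser?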

Improvements on the horizon, such as the Resource Timing API, may improve the situation, but right now RUM is primarily useful for understanding whether problems exist anywhere within an application. It even gives some high-level triage: is the problem in the network, the application server, or the end user's environment? Beyond that, RUM can't tell the difference between a drop in traffic and a loss of network connectivity. Worse yet, an increase in RUM latency might indicate a degradation in backend performance, or it may just be a temporary increase in the use of a relatively slow report-generation feature. To get this information, RUM alone isn't sufficient.
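To see why rising RUM latency is ambiguous, consider a hypothetical site with a fast search page and a slow report page, each with perfectly stable latency. A shift in traffic mix alone moves the aggregate number, with no backend regression at all (all figures below are made up):

```python
# Two page types with constant latencies; only the traffic mix changes.
def mean_latency(traffic):
    """traffic maps page name -> (request_count, latency_ms)."""
    total = sum(n for n, _ in traffic.values())
    return sum(n * ms for n, ms in traffic.values()) / total

monday  = {"search": (900, 100), "report": (100, 2000)}  # 10% slow reports
tuesday = {"search": (700, 100), "report": (300, 2000)}  # 30% slow reports

print(mean_latency(monday))   # 290.0 ms
print(mean_latency(tuesday))  # 670.0 ms, yet nothing got slower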

Synthetic Performance Monitoring

RUM vs Synthetic

Synthetic performance monitoring, sometimes called proactive monitoring, involves having external agents run scripted transactions against a web application.  These scripts are meant to follow the steps a typical user might take (search, view a product, log in, check out) in order to assess the experience of a user.  Traditionally, synthetic monitoring has been done with lightweight, low-level agents, but increasingly it's necessary for these agents to run full web browsers to process the JavaScript, CSS, and AJAX calls that occur on page load.

Unlike RUM, synthetics don't track real user sessions. This has a couple of important implications. First, because the script executes a known set of steps at regular intervals from a known location, its performance is predictable.  That makes it more useful for alerting than often-noisy RUM data.  Second, because it occurs predictably and externally, it's better than RUM at assessing site availability and network problems, particularly if your synthetic monitoring has integrated network insight.
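Because every synthetic run repeats the same steps from the same place, a simple self-baselined threshold is enough for alerting. A minimal sketch, with an invented baseline and a conventional three-sigma rule:

```python
import statistics

def should_alert(baseline_ms, latest_ms, sigmas=3.0):
    """Flag a synthetic measurement that deviates from its own history."""
    mean = statistics.mean(baseline_ms)
    sd = statistics.stdev(baseline_ms)
    return abs(latest_ms - mean) > sigmas * sd

# Recent runs of one scripted checkout from one agent (milliseconds).
baseline = [510, 495, 505, 500, 490, 500]

print(should_alert(baseline, 505))  # False: within normal variation
print(should_alert(baseline, 900))  # True: time to page the on-call
```

The same rule applied to raw RUM samples would fire constantly, because real traffic mixes pages, geographies, and devices into one noisy stream.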

Many companies actually use this sort of monitoring before getting to production, in the form of integration tests with Selenium. Synthetic transactions in production can re-use these same scripts (as long as they don't change data). As applications get more complex, proxy metrics like load or server availability become less useful for measuring uptime. Running Selenium scripts against production isn't a proxy measurement; it measures uptime directly, providing full confidence that if the synthetic transactions are completing, the site is up and running.
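The wrapper that turns such a script into an uptime check is small. In this sketch, `checkout_flow` is a made-up placeholder standing in for a real Selenium transaction; the pattern is simply "the site is up only if the whole flow completes within budget":

```python
import time

def run_check(transaction, budget_s=10.0):
    """Run one scripted transaction; report pass/fail and elapsed time."""
    start = time.monotonic()
    try:
        transaction()  # in real life: a Selenium script's steps
    except Exception as exc:
        return {"up": False, "error": str(exc)}
    elapsed = time.monotonic() - start
    return {"up": True, "slow": elapsed > budget_s, "elapsed_s": elapsed}

def checkout_flow():
    # Placeholder for real browser steps:
    # open the site, search, add to cart, log in, check out.
    pass

print(run_check(checkout_flow))
```

Any unhandled failure in the flow (a broken login, a missing button, a timeout) marks the site as down, which is exactly the direct measurement the paragraph above describes.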

Finally, because synthetic agents have full control over the client (unlike the sandboxed JavaScript powering RUM), the detail that can be gathered is staggering: full waterfall charts, resource-by-resource performance, and even screenshots or videos of the page load in action to determine paint times.  This type of insight is currently also the best way to understand the performance of state transitions in single-page apps.

The Winner?

Looking at synthetics vs RUM, there's a mess of tradeoffs.  Each seems to be better at certain aspects of performance monitoring than the other, so which one wins?

It may not be a competition after all, but rather two complementary puzzle pieces: synthetics provide detail, reliability, and availability, while RUM provides a grounding in real user experience.  For this reason, we're of the mind that the best insight into performance comes from a combination of synthetic and real-user monitoring, which is why AppNeta provides access to both, integrated into TraceView for maximum insight.

Synthetics + RUM = Crazy Delicious

TraceView users have been taking advantage of this integration for months, but it got even better last week with the release of synthetic-RUM comparison.  Now users can plot their real-user traffic against synthetics over time and understand differences in performance across regions, browser types, and synthetic scripts.

RUM vs synthetics.

This allows teams to ensure that their synthetic monitoring is grounded in real-world data, as well as view their synthetic performance in a new, global way.

Best of all?  You can try it for free today!  Click here to get started.

About Dan Kuebrich
Dan Kuebrich is a web performance geek, currently working on Application Performance Management at AppNeta. He was previously a founder of Tracelytics (acquired by AppNeta), and before that worked on AmieStreet/
