Hey there! I'm exploring ways to benchmark Rockwell FactoryTalk View SE performance across different versions and operating systems. After setting up this software in various environments for clients in British Columbia, the question that keeps coming up is whether newer versions actually run faster. Does anyone have suggestions for building an HMI display that can consistently measure performance across versions and operating systems? My current concept is a display that loads 30 pop-ups, aborts them, writes text files to disk using VBA on each page, and retrieves complex derived tag values, all while a stopwatch runs in a docked menu bar. Has anyone attempted anything similar? I'd appreciate hearing about your experiences or any ideas for constructing a tool like this. Thanks!
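For reference, the stopwatch part of my concept would look roughly like this in display VBA. This is an untested sketch: the log path, subroutine names, and the pop-up loop are placeholders, and GetTickCount only gives roughly 10-16 ms resolution, which is fine for multi-second tests.

' Rough stopwatch helper for the benchmark display (untested sketch).
' GetTickCount is a standard kernel32 call available to 32-bit VBA.
Private Declare Function GetTickCount Lib "kernel32" () As Long

Private mStartTicks As Long

Public Sub StopwatchStart()
    mStartTicks = GetTickCount()
End Sub

Public Function StopwatchElapsedMs() As Long
    ' Elapsed milliseconds since StopwatchStart was called
    StopwatchElapsedMs = GetTickCount() - mStartTicks
End Function

Public Sub RunPopupTest()
    ' Placeholder: time the load/abort cycle and log the result
    StopwatchStart
    ' ... issue the Display / Abort commands for the 30 pop-ups here ...
    LogResult "PopupLoadAbort", StopwatchElapsedMs()
End Sub

Private Sub LogResult(testName As String, elapsedMs As Long)
    ' Append results to a plain text log so runs on different
    ' versions / operating systems can be compared side by side
    ' (the folder is a placeholder - it must already exist)
    Dim f As Integer
    f = FreeFile
    Open "C:\HMI_Benchmark\results.txt" For Append As #f
    Print #f, Now & "," & testName & "," & elapsedMs
    Close #f
End Sub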
I recently created a GFX file to monitor CPU load and disk write speed, since there are few built-in options for this. My plan is to expand it to include disk read speed, various gauges, and additional performance metrics. The tool can be valuable for system integrators diagnosing and troubleshooting performance issues. It was developed in FactoryTalk View SE v13, and the disk write speed test creates a text file on the C:\ drive, so make sure View has permission to write to that location.
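For anyone curious, the disk write test is conceptually along these lines. This is a simplified sketch rather than the exact code in the GFX file; the path and block sizes are just examples, and the figures reflect OS-buffered writes unless you force a flush.

' Simplified disk write speed test (sketch of the idea).
' Writes a fixed amount of data to a temp file and reports MB/s.
Public Function DiskWriteMBps() As Double
    Const BLOCK_KB As Long = 64          ' size of each write
    Const BLOCK_COUNT As Long = 1600     ' 1600 x 64 KB = 100 MB total
    Dim buf As String
    Dim f As Integer
    Dim i As Long
    Dim t0 As Single
    Dim elapsed As Single

    buf = String$(BLOCK_KB * 1024, "X")  ' 64 KB of dummy data
    f = FreeFile
    t0 = Timer
    Open "C:\FTView_DiskTest.tmp" For Output As #f
    For i = 1 To BLOCK_COUNT
        Print #f, buf
    Next i
    Close #f
    ' Timer has roughly 10 ms resolution; guard against divide-by-zero
    elapsed = Timer - t0
    If elapsed > 0 Then
        DiskWriteMBps = (BLOCK_KB * BLOCK_COUNT / 1024#) / elapsed
    End If
    Kill "C:\FTView_DiskTest.tmp"
End Function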
The tool has since been updated to launch and close 100 displays and report the elapsed time. Detailed installation instructions are in the .zip file.
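If anyone wants to reproduce the open/close part without the .zip, the core of it is roughly this. It's only a sketch: it assumes the Application object exposes ExecuteCommand to display VBA (check the FTView SE VBA help for your version), and "BenchPopup" is a placeholder for a small display in your own project.

' Sketch: open and abort a display N times and report the elapsed time.
Public Sub DisplayLoadTest()
    Const RUNS As Long = 100
    Dim i As Long
    Dim t0 As Single

    t0 = Timer
    For i = 1 To RUNS
        Application.ExecuteCommand "Display BenchPopup"
        Application.ExecuteCommand "Abort BenchPopup"
    Next i
    ' Depending on version, the Display command may return before the
    ' display has fully drawn, so treat the result as a relative figure
    ' for comparing machines rather than an absolute load time.
    MsgBox RUNS & " open/close cycles took " & Format(Timer - t0, "0.00") & " seconds"
End Sub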
It's great to see someone pushing the boundaries of FactoryTalk View SE performance testing. Running the test with the HMI display you described sounds like a good start, but you should also consider network latency, particularly if clients are using distributed HMI screens over a wide area network. Try comparing the load times of heavy graphic displays: add progressively more graphic objects (analog and digital displays, graphs) and note the time difference between versions and operating systems. Also try a tag update rate test - associate thousands of tags with display objects and track the updates per second. Another scenario worth checking is alarm and event performance; it's easy to underestimate its impact on overall responsiveness. Take the resource consumption of the SE Server itself into account as well - memory usage and CPU load can vary by version and OS. Finally, keep your test environments as identical as possible outside the specific variables you're testing to ensure accuracy. Good luck, and I'm looking forward to hearing about your findings!
Your idea of devising a benchmarking procedure sounds quite intriguing. I don't have direct experience with FactoryTalk View SE, but the method you're describing should give a generally good measure of performance. I'd just suggest controlling as many variables as possible during your tests, so you can be sure any improvements or discrepancies in speed are actually due to the version and not to external factors. What I've found helpful in similar situations is using a common dataset and a fixed set of operations that stress the capabilities of whatever I was testing. I'd also recommend looking at not only speed but also the quality of the display, especially for an HMI, since interface issues can easily offset raw speed improvements.
Your concept sounds like a very comprehensive approach to testing the software across different versions and operating systems. I worked on a similar project a few years back and found that adding queries against the SQL database and testing the load times of larger graphic displays gave a more accurate picture of real-world use. It's also worth considering factors like network speed and RAM, since these influence performance as well. Lastly, I'd strongly recommend documenting and sharing your findings with Rockwell; it could provide valuable feedback for their development team and potentially lead to performance improvements in future releases.
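If you do add SQL round-trips to the benchmark, timing them from display VBA with plain ADO works fine. Something like the sketch below; the connection string, database, and query are placeholders for whatever datalog or alarm history database you actually point it at.

' Sketch: time a round-trip to a SQL Server database using ADO.
Public Function SqlQueryMs() As Double
    Dim cn As Object
    Dim rs As Object
    Dim t0 As Single

    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=SQLOLEDB;Data Source=MYSERVER;" & _
            "Initial Catalog=FTViewDatalog;Integrated Security=SSPI;"
    t0 = Timer
    Set rs = cn.Execute("SELECT TOP 1000 * FROM dbo.TagTable")
    Do While Not rs.EOF
        rs.MoveNext          ' walk the rows so the fetch time is included
    Loop
    SqlQueryMs = (Timer - t0) * 1000#
    rs.Close
    cn.Close
End Function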
Hi there! I've undertaken similar projects in the past, and to better reflect real-world activity you ideally want your benchmark to cover more than just window load times and file I/O. Including tasks like reading and writing to PLCs, executing scripts, or handling alarms will give you further insight. For timing, I'd suggest leaning on the built-in system tags that SE provides rather than a VBA stopwatch, which can drop a few milliseconds here and there. Lastly, repeat each task several times; multiple runs let you measure consistency as well as raw speed. These are just a few suggestions based on my experience. Happy benchmarking!
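On the multiple-runs point, wrapping each test in a small harness that reports min / average / max makes comparisons between versions much more trustworthy. A plain VBA sketch of what I mean (the operation under test is a placeholder; swap in whatever routine you're measuring):

' Sketch: run a test routine several times and report min / average / max,
' so one slow outlier does not skew the comparison between versions.
Public Sub RepeatTest()
    Const RUNS As Long = 10
    Dim i As Long
    Dim t0 As Single
    Dim elapsed As Single
    Dim total As Single, worst As Single, best As Single

    best = 1E+30
    For i = 1 To RUNS
        t0 = Timer
        ' --- call the operation under test here, e.g. the disk write test ---
        elapsed = Timer - t0
        total = total + elapsed
        If elapsed < best Then best = elapsed
        If elapsed > worst Then worst = elapsed
    Next i
    MsgBox "min " & Format(best, "0.000") & "s, avg " & _
           Format(total / RUNS, "0.000") & "s, max " & Format(worst, "0.000") & "s"
End Sub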