Network Performance Diagnostics Tools
AGENDA
1. COMPANY PROFILE
2. ISSUES
3. INTRODUCING NetTrax 3000
4. OBJECTIVES
5. QoS APPLICABLE STANDARDS
6. TESTING REQUIREMENTS & CONFIGURATION
7. TESTING PARAMETERS
8. SAMPLE REPORT
9. WHAT WE OFFER
10. Q & A
1 COMPANY PROFILE
• Established in 2003; inspired by Malaysia’s Vision 2020
• Led by a group of professionals with diverse backgrounds in Electrical/Electronic Engineering, Information and Communications Technology (ICT), Telecommunications and Business Management
• Entrusted by Government and private agencies to handle and manage high-impact, high-risk technical consultancy services
• Solid track record and proven experience in delivering the tasks and projects undertaken
• Proven work-process methodology built on tried and tested procedures
• “Your Satisfaction Is Our Success” – a tagline that embodies the spirit of every NFE Consulting employee
1 HIGH PROFILE PROJECTS
High Speed Broadband (HSBB) Project
• Government Independent Consultant for the High Speed Broadband Project
• Entrusted by the Malaysian Government to oversee and certify an RM 11.3 billion project
• Processes involved:
  • Certification of financial claims
  • Certification of technical claims
  • Certification of project processes and management
1 HIGH PROFILE PROJECTS
Infostructure Consultancy for Multimedia Development Corporation
1 GREEN ICT WORKING GROUP
• Assumed the chairmanship role for the Green ICT Working Group (GICT WG)
• Responsible for developing Green ICT guidelines to be used by industry
• Established 3 working threads to streamline the work process:
  • Promotion and Awareness
  • Green ICT Solutions for Industry
  • Green ICT Metrics and Measurements
1 INDUSTRY INVOLVEMENT
Malaysian Technical Standard Forum Berhad (MTSFB)
• Member of MTSFB
• Responsible for the establishment and maintenance of standards, technical codes, network interoperability and operational issues
• Working Groups (WG) involved:
  • Next Generation Network
  • Wireless Internet Broadband
  • Multimedia Terminal
  • Green ICT
  • IPv6
1 TRACK RECORDS
REGULATORY BODY / AGENCY / CLIENT PROJECTS
Malaysian Communications and Multimedia Commission (MCMC)
• Technical Consultant for PSTN Quality of Service Assessment
• Technical Consultant for Internet Broadband Quality of Service Assessment
• Technical Consultant for Dial-up Internet Quality of Service Assessment
• Independent Consultant for High Speed Broadband Certification
Multimedia Development Corporation (MDeC)
• Consultancy Study on Facilities and Utilities for Cyberjaya
• Audit Assessment on Cybercities and Cybercentres
• Consultancy Service, Study and Telco Services Compilation
• 2G and 3G Cellular Walk Test for Cybercentres
• MSC Building Guidelines Revision
Malaysian Electronic Payment System Sdn Bhd (MEPS)
• Turnkey Consultant and Solution Provider for MEPS Cash Transport
Kementerian Belia dan Sukan
• Multimedia Consultant for the Young Ambassador Award
Kementerian Keselamatan Dalam Negeri
• Security and Surveillance Technical Consultant
State of Terengganu
• Technical Consultant for Wireless Broadband Access for Kuala Terengganu
• Technical Consultant for the Terengganu Internet Exchange
1 TRACK RECORDS
Time Engineering Berhad (TEB)
• Technical Consultant for Langkawi Wireless Broadband
• Technical Consultant for the Wireless Broadband Initiative
Time dotCom Berhad (Time)
• Technical Consultant for the Cyberjaya Wireless Broadband Initiative
• Technical Consultant for Wireless Broadband Access for Penang
MSCMS Sdn Bhd
• Technical Consultant for the Cyberjaya Wireless Broadband Initiative
• Technical Consultant for WiMAX and Wireless Broadband
• Technical Consultant and Solution Provider for a GIS System for MOHA Brunei
M-Mode Multimedia Berhad
• Technical Consultant for the Payment Gateway and Network Infrastructure for the Jakarta Monorail Project
TM Applied Business Sdn Bhd (TAB)
• Technical Consultant for the Anti-Fraud Device (AFD)
• Technical Consultant for EasyNet
TM Payphone
• Technical Consultant for the Static Fraud Problem
TM Research Sdn Bhd
• Technical Consultant for SIP VoIP Telephone
• Technical Consultant for WiFi SIP Phone
1 TRACK RECORDS
Kementerian Kerja Raya Malaysia
• Road Traffic Info System
Tenaga Nasional Berhad
• Utility Survey and Mapping
Konsortium Jaringan Selangor
• Technical Consultant for Utility Survey and GIS Mapping
Felda Berhad
• Turnkey Consultant and Solution Provider for group-wide Attendance, Security and Surveillance
UEM Group Berhad
• Technical Consultant for the ENRICH Project
• Technical Consultant for the GISS Project
• Technical Consultant for the MYREN2 Project
2 EXISTING SCENARIO – CONVERGED NETWORK
[Diagram: separate voice, video, data and wireless networks converging into a single IP network]
• Networks are being transformed by multiple services, mixing data, voice, video and mobility
• Convergence enables cross-device applications (e-mail to TVs, video to mobile handsets, etc.)
• Each service has unique performance and management requirements
• Network problems multiply quickly and become harder to pinpoint, translating into revenue loss and customer churn
2 THE PERENNIAL PROBLEMS
• My network is so slow!
• Am I getting a fair deal for my network?
• Can I verify my network vendor’s claims?
Sounds familiar?
2 DO YOU NEED TO KNOW THIS?
• Is my network ready to handle triple play?
• Which segment of my network may pose problems?
• Does my network QoS conform to the subscribed standard?
• Can my network reliability be improved?
• Does my network fulfill the requirements of current and future applications?
• Am I spending wisely on upgrades?
Always on my mind…
2 ISSUES: THE EXPERTS’ VIEW
“While current trends suggest that broadband access is progressing rapidly and is likely to substantially empower the consumer’s Internet experience, the problems posed by broadband access for supporting QoS are daunting.”
– Dr. William Lehr & Dr. Lee McKnight, A Broadband Access Market Framework: Towards Consumer Service Level Agreements
“Commercial customers, increasingly unhappy with their inability to get predictable QoS for their Internet applications, began to demand SLAs that specified technical performance parameters analogous to those common for traditional telecom offerings, but more appropriate to packet-based IP services (e.g., packet delay bounds, jitter, peak and average bandwidth, and committed information rates).”
“In a service level agreement (SLA), a supplier agrees to achieve defined levels of performance and a customer obtains rights and remedies if the supplier fails to achieve those levels of performance. … The first and most important step in developing an effective SLA is to ask the right questions.”
– Brad L. Peterson, Partner, Mayer, Brown, Rowe & Maw, Ten Key Questions for Developing Effective Service Level Agreements
2 ISSUES: THE FACTS
“According to the Malaysian Communications and Multimedia Commission (MCMC), poor service makes up the bulk, or 35%, of the complaints” – The Star (7/5/2011)
“Malaysia ranks 102nd with an average download speed of 2.61 Mbps” – The Star (7/5/2011)
3 INTRODUCING NetTrax 3000
WHAT IS IT?
• An active testing software tool designed to measure and perform QoS audits on broadband and data networks
• Able to perform measurements on any IP network
HOW DOES IT WORK?
• By measuring broadband Quality of Service (QoS) against industry best practices and the subscribed Service Level Agreement (SLA)
WHY IS IT NEEDED?
• To perform tests and collect data in order to provide an irrefutable analysis based on industry best practices and the subscribed Service Level Agreement (SLA)
4 OBJECTIVES OF THE TEST
• Establish benchmark data on any IP-based network
• Accurately measure network Quality of Service metrics such as network latency, throughput, packet loss, jitter and traceroute
• Provide irrefutable results with accurate, industry-proven measurements via industry-trusted methodology and equipment
• Emulate actual ‘real-world’ deployment in the testing environment
• Quickly identify, isolate and resolve network connectivity and QoS issues network-wide
• Provide gap analysis on the results
• Verify and validate end-to-end network service and performance levels
7 NETTRAX 3000 COMPONENTS
NETTRAX 3000 CLIENT
• Configuration settings and test initiation
• Immediate viewing of results
• Data extraction and viewing
• Data mining
NETTRAX 3000 SERVER
• Supports up to 4 concurrent tests
• Authorises, verifies and validates tests
• Centralised data storage function
• Determines the status of tests and NetTrax Clients
NETTRAX 3000 WEB DASHBOARD
• Enables remote monitoring of current tests
• Allows remote extraction of test results
• Provides secure access
• Runs in any web browser
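The NetTrax internals are proprietary, so purely as an illustration of the client/server split and the "up to 4 concurrent tests" limit described above, here is a minimal sketch of a test server that admits at most four clients at a time. All names, ports and the READY/BUSY handshake are assumptions for this sketch, not the actual NetTrax protocol.

```python
import socket
import threading

MAX_CONCURRENT_TESTS = 4  # the server limit cited above
slots = threading.BoundedSemaphore(MAX_CONCURRENT_TESTS)

def handle_client(conn: socket.socket) -> None:
    """Serve one test session; reject the client if all slots are busy."""
    with conn:
        if not slots.acquire(blocking=False):
            conn.sendall(b"BUSY\n")  # a 5th concurrent test is refused
            return
        try:
            conn.sendall(b"READY\n")  # test authorised; echo test traffic back
            while data := conn.recv(4096):
                conn.sendall(data)
        finally:
            slots.release()

def serve(host: str = "127.0.0.1", port: int = 5000) -> socket.socket:
    """Start the listener and return it; each client gets its own thread."""
    srv = socket.create_server((host, port))
    def loop():
        while True:
            conn, _ = srv.accept()
            threading.Thread(target=handle_client, args=(conn,), daemon=True).start()
    threading.Thread(target=loop, daemon=True).start()
    return srv
```

A real deployment would add the authorisation, validation and centralised storage functions listed above; the semaphore is simply one common way to cap concurrent sessions.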
7 TESTING CONFIGURATION – SEGMENT A
Proposal: segmentation to ‘dissect’ network problems
7 TESTING CONFIGURATION – SEGMENT B
7 TESTING CONFIGURATION – SEGMENT C
7 SCREEN SHOTS
5 QoS APPLICABLE STANDARDS
• ITU-T Rec. E.800/2101
• ITU-T Rec. I.140
• ITU-T Rec. X.902
• ITU-T Rec. H.223
• ITU-T Rec. E.600
• ITU-T Rec. E.417
• Mandatory Standards for Quality of Service (Broadband Access Service), Determination No. 1 of 2007
• ETNO/ETSI EG 202 057 Parts 1, 2 and 3
(ETNO: European Telecommunications Network Operators’ Association; ETSI: European Telecommunications Standards Institute)
6 NetTrax 3000 FEATURES
Types of Measurements & Parameters
• Active (Intrusive) Measurements
  • Performed on artificially generated traffic – the traffic can be tailored to check network parameters
  • Performed under the same conditions as customer use, by generating automatic connections to a service and measuring the relevant end-to-end parameters
  • The intrusive tool is generally deployed at the access network interface, e.g. switch, router, gateway, Customer Premises Equipment (CPE)
  • Four (4) simultaneous NetTrax Clients can be served by one (1) NetTrax Server at any one time
  • Test parameters are configurable – packet size, subscribed bandwidth, delay between tests, etc.
• Parameters
  • Network Latency – delay in transmission caused by the network
  • Throughput – amount of actual bandwidth available to the end user
  • Packet Loss – amount of data loss suffered by the end user, caused by the network
  • Jitter – a measure of the variability over time of packet latency across a network; a network with constant latency has no jitter
  • Traceroute – identifies the route traversed by packets across an Internet Protocol (IP) network from source to destination
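To make "test parameters are configurable" concrete, a configuration like the one described (packet size, subscribed bandwidth, inter-test delay) could be modelled as below. This is an illustrative sketch only; the field names and defaults are assumptions, not the actual NetTrax schema.

```python
import os
from dataclasses import dataclass

@dataclass
class TestConfig:
    """Illustrative active-test configuration (not the actual NetTrax schema)."""
    packet_size_bytes: int = 1400      # size of each artificially generated packet
    subscribed_mbps: float = 4.0       # bandwidth stated in the customer's SLA
    inter_test_delay_s: float = 5.0    # pause between consecutive tests

    def payload(self) -> bytes:
        """Artificially generated traffic tailored to the configured size."""
        return os.urandom(self.packet_size_bytes)

    def packets_per_second(self) -> float:
        """Packet rate needed to fill the subscribed bandwidth."""
        return (self.subscribed_mbps * 1_000_000) / (self.packet_size_bytes * 8)
```

For example, a 4 Mbps subscription tested with 1400-byte packets implies roughly 357 packets per second of generated traffic.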
6 WHERE THE ACTION TAKES PLACE
[OSI reference model: Layer 1 Physical, 2 Datalink, 3 Network, 4 Transport, 5 Session, 6 Presentation, 7 Application]
Layer 4 (Transport)
This layer provides transparent transfer of data between end systems, or hosts, and is responsible for end-to-end error recovery and flow control. It ensures complete data transfer.
Layer 3 (Network)
This layer provides switching and routing technologies, creating logical paths, known as virtual circuits, for transmitting data from node to node. Routing and forwarding are functions of this layer, as are addressing, internetworking, error handling, congestion control and packet sequencing.
Layer 2 (Datalink)
At this layer, data packets are encoded and decoded into bits. It furnishes transmission protocol knowledge and management, and handles errors in the physical layer, flow control and frame synchronization. The data link layer is divided into two sublayers: the Media Access Control (MAC) layer and the Logical Link Control (LLC) layer. The MAC sublayer controls how a computer on the network gains access to the data and permission to transmit it. The LLC sublayer controls frame synchronization, flow control and error checking.
NetTrax 3000 performs tests on OSI Layers 2, 3 and 4.
6 ACTIVE VS PASSIVE TOOL
ACTIVE TEST – Full Turn
• Monitors the condition at full capacity
• Monitors the time taken to fill the bucket
Result: size of bucket; time taken to fill
PASSIVE TEST – Quarter Turn
• Monitors the condition on an as-is basis
• Monitors the time taken to fill the bucket
Result: size of bucket; time taken to fill
In an active test, data is generated artificially to test the capacity of the bandwidth and other factors that can affect the network.
6 TESTING REQUIREMENTS
NetTrax 3000 Server:
• Network connection: valid network connection
• IP address: static IP
• Able to function behind a firewall? Yes; ports 5000-5010 must be opened for NetTrax
• Able to function behind NAT (Network Address Translation)? Yes; port forwarding must be activated
• Additional requirements: power source; secure storage area
NetTrax 3000 Client:
• Network connection: valid network connection
• IP address: dynamic IP
• Able to function behind a firewall? Yes; ports 5000-5010 must be opened for NetTrax
• Able to function behind NAT? Yes; port forwarding must be activated
• Additional requirements: access to the test area
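Since the requirements above hinge on ports 5000-5010 being reachable through firewalls and NAT, a pre-test check along these lines can save a site visit. This is a hypothetical helper, not part of NetTrax, and it checks TCP reachability only.

```python
import socket

def open_ports(host: str, ports: range, timeout: float = 1.0) -> list[int]:
    """Return the subset of TCP ports on `host` that accept a connection."""
    reachable = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the connect succeeded
                reachable.append(port)
    return reachable
```

For example, `open_ports("nettrax-server.example", range(5000, 5011))` (hostname illustrative) should return all eleven ports before a test run begins.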
7 TESTING PARAMETERS
Measurement of Network Latency
What is network latency?
Network latency is simply defined as the time delay observed as data travels from one point to another. Usually, the origin and destination points are used to determine network latency. In some cases, network latency may be defined by the time it takes some form of data to make a full circuit back to the originating point. It is also known as lag or delay.
What do we get by testing latency?
The speed of sending a packet of data from source to destination and back to the source, also called Round Trip Time (RTT). This shows the network speed in milliseconds (ms).
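An RTT measurement of the kind described can be sketched as follows. A loopback UDP echo responder is included so the example is self-contained; in a real test the far end would be the measurement server, and many samples would be averaged.

```python
import socket
import threading
import time

def udp_echo_responder(sock: socket.socket) -> None:
    """Echo every datagram back to its sender (stands in for the far end)."""
    while True:
        data, addr = sock.recvfrom(2048)
        sock.sendto(data, addr)

def measure_rtt_ms(target: tuple[str, int], payload: bytes = b"x" * 64) -> float:
    """Send one datagram, wait for the echo, return round-trip time in ms."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.settimeout(2.0)
        start = time.perf_counter()
        s.sendto(payload, target)
        s.recvfrom(2048)  # blocks until the echo returns
        return (time.perf_counter() - start) * 1000.0
```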
Measurement of Throughput
What is throughput?
Throughput is the rate at which a computer or network sends or receives data. It is therefore a good measure of the channel capacity of a communications link, and connections to the Internet are usually rated in terms of how many bits per second (bit/s) they carry.
What do we get by testing throughput?
It tells us the actual capacity of the network.
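The bits-per-second arithmetic behind a throughput figure is simple enough to state directly:

```python
def throughput_mbps(bytes_transferred: int, elapsed_s: float) -> float:
    """Throughput in megabits per second (1 byte = 8 bits)."""
    return (bytes_transferred * 8) / (elapsed_s * 1_000_000)
```

So transferring 1,250,000 bytes in one second corresponds to 10 Mbps; this is the conversion applied to the raw byte counts a test collects.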
Measurement of Packet Loss
What is packet loss?
Packet loss occurs when one or more packets of data travelling across a computer network fail to reach their destination in time and in the right sequence.
What do we get from the packet loss result?
It tells us the reliability of the network.
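Loss is typically measured by numbering the test packets and counting the sequence numbers that never arrive, as in this minimal sketch:

```python
def packet_loss_pct(sent: int, received_seqs: list[int]) -> float:
    """Percentage of sequence-numbered packets that never arrived."""
    lost = sent - len(set(received_seqs))  # duplicates are counted once
    return 100.0 * lost / sent
```

For example, if 5 packets are sent and only sequence numbers 0, 1, 3 and 4 come back, the loss is 20%.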
Measurement of Jitter
What is jitter?
In voice over IP (VoIP) and video over IP, jitter is the variation in the time between packets arriving, caused by network congestion, timing drift or route changes.
What do we get from the jitter result?
It tells us the readiness of the network for triple-play applications (voice, video and data).
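Jitter as defined here, the variability of packet latency, can be estimated from a series of latency samples. RFC 3550 (RTP) specifies a smoothed interarrival-jitter estimator; as a simplifying assumption, this sketch uses the mean absolute difference between successive samples instead.

```python
def jitter_ms(latencies_ms: list[float]) -> float:
    """Mean absolute difference between successive latency samples."""
    if len(latencies_ms) < 2:
        return 0.0
    diffs = [abs(b - a) for a, b in zip(latencies_ms, latencies_ms[1:])]
    return sum(diffs) / len(diffs)
```

Note that a network with constant latency yields zero jitter, matching the definition given earlier.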
Traceroute Measurement
What is traceroute?
Traceroute shows the route traversed by a packet over the network, listing all the intermediate network elements a packet must pass through to get to its destination.
What do we get by testing traceroute?
It can help determine why connections to a given server might be poor, and can often help pinpoint where exactly the problem is. It also shows how systems are connected to each other, letting you see how your ISP connects to the Internet, as well as how the target system is connected.
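To isolate a problematic network segment, the hop list has to be turned into data. The sketch below parses typical Unix `traceroute` output into (hop, node, RTT) tuples; the output format varies by platform, so the regular expression is an assumption tied to that common layout.

```python
import re

# One hop line as printed by a typical Unix traceroute, e.g.
#  " 3  203.0.113.1 (203.0.113.1)  14.532 ms  14.101 ms  13.990 ms"
HOP = re.compile(r"^\s*(\d+)\s+(\S+).*?([\d.]+)\s+ms")

def parse_hops(output: str) -> list[tuple[int, str, float]]:
    """Extract (hop number, node, first RTT in ms) from traceroute output."""
    hops = []
    for line in output.splitlines():
        m = HOP.match(line)
        if m:
            hops.append((int(m.group(1)), m.group(2), float(m.group(3))))
    return hops
```

A sudden jump in RTT between two consecutive hops is the usual signal for where the problem segment begins.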
7 TESTING EQUIPMENT – MINIMUM REQUIREMENTS
NETTRAX 3000 SERVER
• 1U rack-mounted
• Xeon Quad Core processor
• 4 GB RAM
• 320 GB HDD
• Network card
• MS Windows 7
• NetTrax 3000 Server Software
• MS Excel and MS Access
NETTRAX 3000 CLIENT
• Laptop or desktop
• Intel i3 processor
• 4 GB RAM
• 320 GB HDD
• Network card
• MS Windows 7
• NetTrax 3000 Client Software
• MS Excel and MS Access
8 SAMPLE REPORT
What the graph shows: Latency results of the test environment against the Regulatory Authority Mandatory Standards.
Initial analysis: Partial compliance; the cause of the latency needs to be identified.
What the graph shows: Bandwidth/throughput (TCP) results of the test environment against the Regulatory Authority Mandatory Standards.
Initial analysis: Download is good, but upload needs to be rectified.
What the graph shows: Packet loss (TCP) results of the test environment against the Regulatory Authority Mandatory Standards.
Initial analysis: All tests complied with the Regulatory Authority Mandatory Standards; however, Test 2 shows abnormal packet loss.
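A compliance verdict of this kind reduces to comparing measured samples against a mandatory limit. The sketch below shows the shape of that gap analysis; the limit and the minimum compliance percentage are placeholders, since the real values come from the applicable Regulatory Authority Mandatory Standards, not from this example.

```python
def compliance_report(samples: list[float], limit: float, min_pct: float = 90.0) -> dict:
    """Share of samples within the mandatory limit, plus a pass/fail verdict.

    `limit` and `min_pct` are illustrative placeholders; the actual values
    are set by the applicable mandatory standard.
    """
    within = sum(1 for s in samples if s <= limit)
    pct = 100.0 * within / len(samples)
    return {"pct_compliant": pct, "verdict": "comply" if pct >= min_pct else "not comply"}
```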
9 OUR OFFERINGS
OUTRIGHT
• Outright purchase of 1 x NetTrax Server Software and 4 x NetTrax Client Software
• Inclusive of hardware, training and installation (subject to location)
RENTAL SOFTWARE
• Rental of NetTrax Software to perform tests for a period of time
• Excluding hardware, installation, training, and data analysis & reporting
RENTAL SERVICES
• Rental of NetTrax Software to perform tests for a period of time, with professional services
• Our consultants will perform the testing and compile the results, analysis and reporting
MANAGED SERVICES
• For a fixed monthly fee over a longer period of time, we will perform periodic tests and deliver reports to the customer
• No initial investment from the customer, eliminating CAPEX and OPEX
9 SAMPLE APPLICATIONS
9a Testing Details
Tools for Measurements (active measurement needed due to a lack of existing statistics):
• NetTrax 3000
• TEMS Investigation for UMTS (software)
• TEMS Scanner (software + hardware)
• External GPS
• User Equipment (UE)
• TEMS DeskCat for post-processing
• Google Earth
• MS Access/Excel-based tools
9a Testing Details
Testing Location – Subang Bestari (longitude 101.5438, latitude 3.1858)
9a Testing Details
Testing Location – Senawang (longitude 101.9858, latitude 2.6736)
9b Methodology
Tool Deployment (Subang Bestari and Senawang Jaya)
9c Methodology
Result Calculation
9d Methodology
Result Calculation
9e Methodology
Signal quality classification
Ec/No (measured scale -17 to -1 dB):
• GOOD: Ec/No ≥ -3 dB
• MODERATE: -5 dB ≤ Ec/No < -3 dB
• POOR: Ec/No < -5 dB
RSCP (measured scale -105 to -74 dBm):
• GOOD: RSCP ≥ -85 dBm
• MODERATE: -95 dBm ≤ RSCP < -85 dBm
• POOR: RSCP < -95 dBm
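The POOR/MODERATE/GOOD banding used in this methodology can be expressed as helper functions. The exact boundary values below are reconstructed from a partially garbled source table and should be treated as assumptions; substitute the project's actual thresholds.

```python
def classify_rscp(rscp_dbm: float) -> str:
    """RSCP banding: GOOD >= -85 dBm, MODERATE -95 to -85 dBm, POOR < -95 dBm (assumed)."""
    if rscp_dbm >= -85:
        return "GOOD"
    if rscp_dbm >= -95:
        return "MODERATE"
    return "POOR"

def classify_ecno(ecno_db: float) -> str:
    """Ec/No banding: GOOD >= -3 dB, MODERATE -5 to -3 dB, POOR < -5 dB (assumed)."""
    if ecno_db >= -3:
        return "GOOD"
    if ecno_db >= -5:
        return "MODERATE"
    return "POOR"
```

Applied to the averages in the results summary, Subang Bestari's RSCP of -68.61 dBm would classify as GOOD under these assumed bands.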
9f Results - Summary
Test Type                    Subang Bestari   Senawang Jaya    Remarks
Latency                      not comply       comply
Throughput TCP Upload        not comply       not comply
Throughput TCP Download      not comply       not comply
Throughput UDP Upload        not comply       not comply
Throughput UDP Download      comply           not comply
TCP Packet Loss              not comply       comply
UDP Packet Loss Upload       comply           comply
UDP Packet Loss Download     comply           comply
UDP Jitter Upload            not comply       not comply
UDP Jitter Download          comply           not comply
Traceroute                   -                -
Ec/No                        -4.83            -9.81            Average reading (dBm)
RSCP                         -68.61           -83.79           Average reading (dBm)
9g Results - Summary

Subang Bestari (Based on PIR)
Test Type                      Point A (100m)  Point B (400m)  Point C (600m)  Point D (1000m)
Latency                        6.67%           0.00%           60.00%          3.33%
Throughput TCP Upload          0.00%           0.00%           0.00%           0.00%
Throughput TCP Download        0.00%           0.00%           0.00%           0.00%
Throughput UDP Upload          33.33%          16.67%          13.33%          23.33%
Throughput UDP Download        90.00%          100.00%         96.67%          100.00%
TCP Packet Loss                10.00%          4.17%           5.00%           6.00%
UDP Packet Loss Upload         0.85%           0.82%           0.85%           0.85%
UDP Packet Loss Download       0.96%           0.93%           0.96%           0.96%
UDP Jitter Upload (ms)         92.429          79.653          69.534          84.490
UDP Jitter Download (ms)       20.280          6.152           16.821          24.741
Traceroute                     -               -               -               -
Ec/No (dBm)                    -5.900          -4.160          -3.410          -5.870
RSCP (dBm)                     -64.150         -60.650         -66.540         -83.100

Subang Bestari (Average Results)
Test Type                      Point A (100m)  Point B (400m)  Point C (600m)  Point D (1000m)
Latency (ms)                   310             339             200             345
Throughput TCP Upload (Mbps)   0.220           0.281           0.256           0.280
Throughput TCP Download (Mbps) 1.176           1.224           1.192           1.219
Throughput UDP Upload (Mbps)   2.022           1.851           1.740           2.106
Throughput UDP Download (Mbps) 2.885           3.109           3.02            3.145
TCP Packet Loss                10.00%          4.17%           5.00%           6.00%
UDP Packet Loss Upload         0.85%           0.82%           0.85%           0.85%
UDP Packet Loss Download       0.96%           0.93%           0.96%           0.96%
UDP Jitter Upload (ms)         92.429          79.653          69.534          84.490
UDP Jitter Download (ms)       20.280          6.152           16.821          24.741
Traceroute                     -               -               -               -
Ec/No (dBm)                    -5.900          -4.160          -3.410          -5.870
RSCP (dBm)                     -64.150         -60.650         -66.540         -83.100

Senawang Jaya (Based on PIR)
Test Type                      Point A (100m)  Point B (400m)  Point C (600m)  Point D (1000m)
Latency                        100.00%         96.67%          100.00%         96.67%
Throughput TCP Upload          0.00%           0.00%           0.00%           0.00%
Throughput TCP Download        0.00%           0.00%           0.00%           0.00%
Throughput UDP Upload          26.67%          23.33%          36.67%          56.67%
Throughput UDP Download        3.33%           0.00%           0.00%           0.00%
TCP Packet Loss                0.00%           0.83%           2.50%           3.33%
UDP Packet Loss Upload         0.85%           0.85%           0.86%           0.89%
UDP Packet Loss Download       0.97%           0.97%           0.97%           0.97%
UDP Jitter Upload (ms)         46.608          48.608          52.614          94.432
UDP Jitter Download (ms)       54.312          66.984          96.588          561.597
Traceroute                     -               -               -               -
Ec/No (dBm)                    -7.47           -8.74           -10.28          -12.74
RSCP (dBm)                     -67.25          -78.74          -93.45          -95.74

Senawang Jaya (Average Results)
Test Type                      Point A (100m)  Point B (400m)  Point C (600m)  Point D (1000m)
Latency (ms)                   87              115             99              118
Throughput TCP Upload (Mbps)   0.300           0.293           0.310           0.263
Throughput TCP Download (Mbps) 0.898           0.850           0.932           0.635
Throughput UDP Upload (Mbps)   2.216           2.204           2.472           3.062
Throughput UDP Download (Mbps) 2.015           1.840           1.419           0.642
TCP Packet Loss                0.00%           0.83%           2.50%           3.33%
UDP Packet Loss Upload         0.85%           0.85%           0.86%           0.89%
UDP Packet Loss Download       0.97%           0.97%           0.97%           0.97%
UDP Jitter Upload (ms)         46.608          48.608          52.614          94.432
UDP Jitter Download (ms)       54.312          66.984          96.588          561.597
Traceroute                     -               -               -               -
Ec/No (dBm)                    -7.47           -8.74           -10.28          -12.74
RSCP (dBm)                     -67.25          -78.74          -93.45          -95.74
9h Results - Summary
[Charts: Subang Bestari – latency (ms), throughput (Mbps) and packet loss (%) at Points A-D, each plotted against the average Ec/No (dBm) and RSCP (dBm)]
9i Results - Summary
[Charts: Senawang Jaya – latency (ms), throughput (Mbps) and packet loss (%) at Points A-D, each plotted against the average Ec/No (dBm) and RSCP (dBm)]
10 Q & A
For further information:
NFE Consulting Sdn. Bhd.
15C, 15th Floor, Plaza Ampang City
Jalan Ampang
50450 Kuala Lumpur
Tel : (6)03 – 4251 6228
Fax : (6)03 – 4252 6228
URL : www.nfeconsulting.com
E-Mail : [email protected]