Hello all. I have been asked to calculate the RTT (round-trip time) of a network. I have written a program that sends a packet to a server program, and the server program then sends a packet back to the client. So far I have got it to record the time in milliseconds when the packet is sent to the server and calculate the difference at the server end. Here is a sample output from my program:
SERVER:
cookie@HAL ~$ mono server.exe
Server started...
port: 2555
Waiting for client input:
time: 469842688960
Client Time: 469842688960
Server Time: 469842933160
Time taken is: 244200 ( milliseconds )
CLIENT:
cookie@HAL ~$ mono client.exe 127.0.0.1
type help for information, exit or quit to close the application
Waiting for Message to send:
time
time: 469842688960
server: reply no0
So the chain of events is: I type 'time' in my client program, and it takes a 'tick' of the time of day: 469842688960 (in milliseconds). It sends that in a packet to the server, which takes its own 'tick' of the time of day. The server then takes the client time (469842688960) and the server time (469842933160) and subtracts them to get the difference: Time taken is: 244200 (milliseconds).
So from this, is it possible to find the RTT? Cheers.
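One thing worth noting: if the client records both timestamps itself (one just before sending, one just after the reply arrives), the difference is the RTT directly, and the two machines' clocks never have to agree. A minimal sketch of that idea, assuming a UDP echo server on port 2555 as in the output above (the class and variable names are just illustrative):

using System;
using System.Diagnostics;
using System.Net;
using System.Net.Sockets;
using System.Text;

class RttClient
{
    static void Main(string[] args)
    {
        string server = args.Length > 0 ? args[0] : "127.0.0.1";
        using (var udp = new UdpClient())
        {
            udp.Connect(server, 2555);

            byte[] payload = Encoding.ASCII.GetBytes("ping");
            Stopwatch sw = Stopwatch.StartNew();   // clock starts on the client, just before send
            udp.Send(payload, payload.Length);

            var remote = new IPEndPoint(IPAddress.Any, 0);
            udp.Receive(ref remote);               // block until the echo comes back
            sw.Stop();

            // Both timestamps came from the same machine, so the
            // difference is the full round trip: out plus back.
            Console.WriteLine("RTT: {0} ms", sw.ElapsedMilliseconds);
        }
    }
}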
I am a poor sysadmin, not a programmer, but I think you could use the same approach as ping: send a single packet to the target machine and get the reply back, then measure the time elapsed between the two. Send several packets in a row and calculate the arithmetic mean. Each packet should have a label or something like that in the payload area, so you can track the order in which they arrive.
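That ping-style approach might look something like the sketch below, again assuming a UDP echo server at 127.0.0.1:2555 (the probe count, timeout, and payload format are arbitrary choices for illustration). Each probe carries a sequence number so a reply can be matched to its send, and the mean is taken over the probes that actually came back:

using System;
using System.Diagnostics;
using System.Net;
using System.Net.Sockets;
using System.Text;

class RttProbe
{
    const int Probes = 10;

    static void Main()
    {
        using (var udp = new UdpClient())
        {
            udp.Connect("127.0.0.1", 2555);
            udp.Client.ReceiveTimeout = 2000;   // don't hang forever on a lost packet

            long totalMs = 0;
            int received = 0;

            for (int seq = 0; seq < Probes; seq++)
            {
                byte[] payload = Encoding.ASCII.GetBytes("seq=" + seq);
                Stopwatch sw = Stopwatch.StartNew();
                udp.Send(payload, payload.Length);

                try
                {
                    var remote = new IPEndPoint(IPAddress.Any, 0);
                    byte[] reply = udp.Receive(ref remote);
                    sw.Stop();

                    // Only count the reply if it echoes the sequence number
                    // we just sent; stale or out-of-order replies are ignored.
                    if (Encoding.ASCII.GetString(reply) == "seq=" + seq)
                    {
                        totalMs += sw.ElapsedMilliseconds;
                        received++;
                        Console.WriteLine("seq {0}: {1} ms", seq, sw.ElapsedMilliseconds);
                    }
                }
                catch (SocketException)
                {
                    Console.WriteLine("seq {0}: timed out", seq);
                }
            }

            if (received > 0)
                Console.WriteLine("mean RTT over {0} replies: {1} ms",
                                  received, totalMs / (double)received);
        }
    }
}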
Ah, cheers, that's more what I was after; I thought it was going to be some sort of hardcore calculation! So do I measure it in milliseconds or nanoseconds, etc., does it matter? I think I'm going to measure it in milliseconds unless someone suggests otherwise. Cheers, trscookie.
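On units: for localhost or LAN round trips, whole milliseconds can round a small RTT down to 0, so it may help to read the timer at its native resolution and convert. A small sketch, again just illustrative, using .NET's Stopwatch:

using System;
using System.Diagnostics;

class TimerResolution
{
    static void Main()
    {
        Stopwatch sw = Stopwatch.StartNew();
        // ... send the packet and wait for the reply here ...
        sw.Stop();

        // Frequency is ticks per second, so this converts raw ticks
        // to microseconds regardless of the platform's tick length.
        double micros = sw.ElapsedTicks * 1000000.0 / Stopwatch.Frequency;
        Console.WriteLine("RTT: {0:F1} us", micros);
        Console.WriteLine("Timer resolution: {0} ticks per second", Stopwatch.Frequency);
    }
}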
On the left-hand side, I know there is no source available yet and it's a little buggy, but it will be released in due time. (You need the .NET framework, or Mono, to run it.) I'm quite proud of it, as it's my first C# project!