netSimulateDatagramPacketLoss function (net library)
Enable simulated datagram socket failures.
Often, testing a networked app on your development machine, which might have a wired connection to a fast, reliable network service, won't expose bugs that surface when networks fail intermittently in the real world: when the wifi is flaky and firewalls get in the way.
This function allows you to tell the library to pretend that some percentage of datagram socket data transmission will fail.
The library will randomly lose packets (both incoming and outgoing) at an average rate matching percent_loss. Setting this to zero (the default) disables the simulation. Setting it to 100 means everything fails unconditionally and no further data will get through. At what percentage the system merely borders on unusable is left as an exercise to the app developer.
This is intended for debugging purposes, to simulate real-world conditions that are various degrees of terrible. You probably should not call this in production code, where you'll likely see real failures anyhow.
\param sock The socket to set a failure rate on.
\param percent_loss A number between 0 and 100. Higher means more failures. Zero to disable.
\threadsafety It is safe to call this function from any thread.
\since This function is available since SDL_net 3.0.0.
extern SDL_DECLSPEC void SDLCALL NET_SimulateDatagramPacketLoss(NET_DatagramSocket *sock, int percent_loss)
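A minimal usage sketch of the Dart binding. The socket creation and destruction calls (`netCreateDatagramSocket`, `netDestroyDatagramSocket`) are assumed companion bindings and are illustrative; only `netSimulateDatagramPacketLoss` is documented on this page, and SDL_net must already be initialized:

```dart
// Sketch only: assumes SDL_net is initialized and that companion
// bindings netCreateDatagramSocket / netDestroyDatagramSocket exist
// (illustrative names, not confirmed by this page).
final sock = netCreateDatagramSocket(nullptr, 0); // any local port

// Pretend roughly 30% of datagrams (incoming and outgoing) are lost.
netSimulateDatagramPacketLoss(sock, 30);

// ... exercise the app's retry/timeout logic under lossy conditions ...

// Passing 0 (the default) disables the simulation again.
netSimulateDatagramPacketLoss(sock, 0);

netDestroyDatagramSocket(sock);
```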
Implementation
void netSimulateDatagramPacketLoss(
  Pointer<NetDatagramSocket> sock,
  int percentLoss,
) {
  // Resolve the native NET_SimulateDatagramPacketLoss symbol from the
  // loaded SDL_net library, then invoke it with the given socket and rate.
  final netSimulateDatagramPacketLossLookupFunction = _libNet
      .lookupFunction<
        Void Function(Pointer<NetDatagramSocket> sock, Int32 percentLoss),
        void Function(Pointer<NetDatagramSocket> sock, int percentLoss)
      >('NET_SimulateDatagramPacketLoss');
  netSimulateDatagramPacketLossLookupFunction(sock, percentLoss);
}