I am writing a function that generates n random numbers x such that xmin < x < xmax. This is easy to do with a uniform distribution using rand():

```
srand(time(NULL));  /* seed once, before generating */
int points[n];
for (int i = 0; i < n; i++) {
    /* note: % introduces a slight modulo bias and yields xmin <= x < xmax */
    points[i] = rand() % (xmax - xmin) + xmin;
}
```

However, I would like to control the distribution so that the probability density of a given x value is `px = (px2 * (x - xmin) + px1 * (xmax - x)) / (xmax - xmin)`, where px1 and px2 are constants. In other words, a linear distribution: the density interpolates linearly from px1 at xmin to px2 at xmax.

I can fake this by partitioning the interval into sufficiently small discrete subintervals and applying the uniform algorithm above to each one, with n proportional to the average probability across that subinterval. However, I would prefer to apply a truly continuous distribution across the whole interval. Can this be done, either using rand() or with another approach?