140609-0751 EDT
The question in the OP is perfectly clear. It states that the current is constant. That means a constant-current source, and it does not matter what circuit generates that constant current.
In the practical circuits most of you work with, you assume the source is a constant voltage. In the real world of an electrical distribution system it is not truly constant, but for convenience in many calculations it is assumed to be. Suppose this practical circuit is an ideal constant-voltage source of 240 V with a 240 ohm load resistor and a series switch contact with a voltage drop of 0.01 V, i.e. a contact resistance of about 0.01 ohms. If the contact resistance increases to about 0.02 ohms, what is the change in current? What is the change in voltage across the contact?
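A minimal sketch of that arithmetic, assuming only the values given above (240 V ideal source, 240 ohm load, contact resistance stepping from 0.01 ohm to 0.02 ohm):

```python
# Series circuit: ideal 240 V source, 240 ohm load, small contact resistance.
V_SOURCE = 240.0   # volts, assumed ideal constant-voltage source
R_LOAD = 240.0     # ohms, load resistor

for r_contact in (0.01, 0.02):
    i = V_SOURCE / (R_LOAD + r_contact)   # current through the series loop
    v_contact = i * r_contact             # voltage drop across the contact
    print(f"R_contact = {r_contact:.2f} ohm -> I = {i:.6f} A, "
          f"V_contact = {v_contact:.5f} V")

# Approximate output:
#   R_contact = 0.01 ohm -> I = 0.999958 A, V_contact = 0.01000 V
#   R_contact = 0.02 ohm -> I = 0.999917 A, V_contact = 0.02000 V
```

The current changes by only about 0.004 %, while the contact voltage doubles, which is why treating the current as constant is a reasonable approximation here.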
The question in the OP is a very good one and does not require close reading to understand. It states in black and white that the current is constant, and it represents an approximation to a real-world problem.