Q:

Sofia wants to rent an apartment. She visits apartment A, which charges a $100 one-time deposit and $400 per month in rent. She visits apartment B, which charges a $500 deposit but only $350 per month in rent. Write and solve an inequality that shows how many months it would take for the total paid to be less for apartment B.

Accepted Solution

A:
Let x be the number of months.

Apartment A: one-time deposit = $100, monthly rent = $400, so the total cost after x months = 100 + 400x.
Apartment B: one-time deposit = $500, monthly rent = $350, so the total cost after x months = 500 + 350x.

We want the total paid for apartment B to be less than the total paid for apartment A, so the required inequality is:

500 + 350x < 100 + 400x

Now solve the inequality:

500 - 100 < 400x - 350x
400 < 50x

Divide both sides by 50:

8 < x

Hence x > 8: apartment B's total becomes less than apartment A's once the rental lasts more than 8 months. (At exactly 8 months both totals equal $3,300.)
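A quick numeric check of the answer (a minimal sketch; the two cost functions below simply restate the deposits and rents from the problem):

```python
def cost_a(months):
    """Total paid for apartment A: $100 deposit + $400/month."""
    return 100 + 400 * months

def cost_b(months):
    """Total paid for apartment B: $500 deposit + $350/month."""
    return 500 + 350 * months

# Find the first whole month where apartment B is strictly cheaper.
first_cheaper = next(x for x in range(1, 100) if cost_b(x) < cost_a(x))
print(first_cheaper)  # 9: B is cheaper once x > 8 months
```

At month 8 the totals tie ($3,300 each), and from month 9 onward apartment B stays cheaper, matching the solution x > 8.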