Personalisation with Privacy

Businesses are increasingly interested in tailoring their services
and products to the personal needs of their customers.
Recommender systems, for example, try to determine customers' personal
preferences by analysing their past purchases and linking their future
actions with the actions of other customers who are found to have
behaved similarly in the past. To be effective, personalisation
algorithms depend strongly on the availability of information about
the customer's past behaviour. However, public awareness of how much
personal information businesses exploit is growing, and people
increasingly want to protect their privacy. The challenge, then, is to
develop algorithms that provide good personalised recommendations
while respecting the privacy levels imposed by the customer.
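
As a concrete illustration, the sketch below implements user-based
collaborative filtering, the kind of "customers who behaved similarly"
reasoning described above. The ratings matrix, similarity measure and
all values are illustrative assumptions, not part of the project
specification.

    import numpy as np

    # Toy ratings matrix: rows are customers, columns are products,
    # 0 means "not yet rated". All values are illustrative.
    ratings = np.array([
        [5, 3, 0, 1],
        [4, 0, 0, 1],
        [1, 1, 0, 5],
        [0, 0, 5, 4],
    ], dtype=float)

    def cosine_similarity(a, b):
        # How similarly two customers have rated products in the past.
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return (a @ b) / denom if denom else 0.0

    def predict(user, item):
        # Estimate a rating as a similarity-weighted average of the
        # ratings other customers gave to this item.
        others = [v for v in range(len(ratings)) if v != user]
        sims = np.array([cosine_similarity(ratings[user], ratings[v])
                         for v in others])
        item_ratings = np.array([ratings[v, item] for v in others])
        rated = item_ratings > 0  # only customers who rated the item
        if sims[rated].sum() == 0:
            return 0.0
        return sims[rated] @ item_ratings[rated] / sims[rated].sum()

    # Predicted rating of product 1 for customer 1.
    print(predict(user=1, item=1))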

One model of privacy support in machine learning systems is
differential privacy, which ensures that the removal of a single
record from the input data does not significantly change the
statistics of the algorithm's output. A differentially private
recommendation algorithm aims to protect customer data from
third-party privacy attackers. In such a scenario,
the recommender system has access to the customer's data and seeks to
ensure protection of that data. Another scenario is that the
individual customer may want to keep their data private even from the
system providing the recommendation.
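
One standard way to realise the guarantee above is the Laplace
mechanism: because adding or removing one customer's record changes a
per-item purchase count by at most one (the sensitivity), adding
Laplace noise with scale 1/epsilon to each count masks any
individual's presence. The sketch below is a minimal illustration
under these assumptions; the counts and epsilon value are made up.

    import numpy as np

    rng = np.random.default_rng(0)

    def private_counts(counts, epsilon, sensitivity=1.0):
        # Release per-item purchase counts with epsilon-differential
        # privacy. Sensitivity 1 assumes one customer's record changes
        # each count by at most 1.
        noise = rng.laplace(0.0, sensitivity / epsilon, size=len(counts))
        return counts + noise

    # Toy data: purchases per product.
    true_counts = np.array([120.0, 45.0, 300.0, 8.0])
    print(private_counts(true_counts, epsilon=0.5))
    # Smaller epsilon means more noise: stronger privacy, but noisier
    # statistics for the recommender to work with.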

This project will study the relationship between privacy and
personalisation and will implement and evaluate one or more
recommendation algorithms that aim to respect privacy. Decisions on
the exact algorithms to be implemented and evaluated will be made
after some initial research at the beginning of the project. The
algorithms will be evaluated in terms of the level of privacy support
that they provide and the quality of the recommendations they maintain.