Lack of AI governance in real estate poses a risk – 7 tips for safe use

Only 6 percent of companies in the real estate industry have guidelines for AI use, and there is a significant gap between management’s perception of how artificial intelligence is used and the actual use among employees.
Elisabeth Aspaas Runsjø, Head of the AI Compliance & Legal team at BDO, and Sigmund Olav Lie, Industry Leader for Real Estate at BDO.

With the right approach, the technology can provide a competitive advantage – but without guidelines, the consequences can be serious.

Imagine an AI solution that automatically responds to customer enquiries for a real estate company. The system retrieves information from various data sources to provide answers, but an error in the data foundation leads to sensitive information about former tenants or internal contract terms being shared with unauthorised parties.

Such a scenario is not unlikely when artificial intelligence is used without clear boundaries. AI has the potential to improve efficiency, insight and decision-making – but without control, it can also pose a risk.


AI creates value – but also challenges

As many as 83 percent of those who have adopted AI in the real estate industry report that the technology adds value. This is shown by a survey conducted by Norstat on behalf of BDO this year. Nevertheless, the same survey shows that only 6 percent have guidelines for its use, and that there is a significant gap between what management believes and what employees actually do: only 24 percent of managers believe that employees use their own AI tools at work, while Microsoft’s latest Work Trend Index shows that as many as 59 percent of employees actually do.

– It’s a clear signal that real estate players who are already working with AI should also take action on security. The technology is powerful, but also so new and complex that we must ensure safe and responsible implementation, says Sigmund Olav Lie, Industry Leader for Real Estate at BDO.


Where AI creates risk

– Without guidelines, there’s a risk that the technology will be misused, or that business-critical information may be exposed, says Elisabeth Aspaas Runsjø, Head of the AI Compliance & Legal team at BDO.

The use of AI involves risks on several levels:

  • Legal: Breaches of GDPR and other legislation if personal data is processed incorrectly, without valid legal basis, or without necessary risk assessments – for example, in cases of automated rejections of rental applications, ranking of prospective tenants, or AI-based facial recognition or use of biometrics for access to commercial buildings.
  • Organisational: Lack of governance and routines can lead to unclear roles and responsibilities, lack of control over which AI tools are being used, and employees adopting solutions without training – for example, for automated price adjustments or customer segmentation. 
  • Ethical: Biases in the data foundation can result in unfair or discriminatory decisions and outcomes – for example, if rental applicants are automatically filtered out based on profiling without human assessment, or if AI prioritises access, resources or offerings in commercial buildings based on previous usage patterns that favour certain groups.

There are many good AI solutions on the market, but also many open solutions that say nothing about how the data you input is processed.

– Use of open and non-transparent AI solutions – for example, to analyse tenants, manage access control for employees and suppliers, or control pricing and space – without clear oversight of purpose, data types and security, can lead to privacy breaches and leakage of sensitive customer data or internal strategies related to the property portfolio, says Runsjø.


7 tips for the safe and effective use of AI in real estate

  1. Map the actual use of AI among employees and ensure that the findings are included in risk assessments as part of the organisation’s internal controls.
  2. Establish clear guidelines for AI use that cover both legal requirements (such as data protection, information security and transparency) and internal principles for ethical and responsible use.
  3. Anchor responsibility in the governance system: The use of AI is a management responsibility and should be integrated into the organisation’s internal control, risk management and reporting structure on par with other strategic initiatives.
  4. Start using AI in low-risk areas, and assess how sensitive the data is before implementation, even in seemingly simple processes such as document handling.
  5. Document AI use thoroughly: Describe the purpose, data types, legal basis for processing, and assess risks to privacy and information security.
  6. Build interdisciplinary competence, for example through an AI working group that includes legal, technological and organisational expertise.
  7. Conduct specific assessments when using AI in profiling, credit assessments or rental applications, and evaluate the risk of discrimination and bias.

The real estate industry has much to gain from leveraging artificial intelligence. But like any powerful tool, it requires responsibility. By taking control of how the technology is used, players in the industry can both minimise risk and maximise benefits. The future belongs to those who dare to think differently – and think safely.