AI can’t stop vote buying in 2027 polls - Don

Prof Adenike Osofisan of the University of Ibadan has said that while the deployment of Artificial Intelligence and other technological tools would enhance the credibility of elections, it would not curb fundamental human and structural electoral issues like vote buying.

Osofisan spoke during a University of Ibadan panel discussion titled, “AI and the 2027 General Elections in Nigeria: The Realities, The Fakes and The Absurd.” The event, organised by the UI Senior Staff Club in collaboration with Diamond FM and UITV, provided a platform for stakeholders to examine both the promise and peril of artificial intelligence in safeguarding Nigeria’s electoral future.

In her remarks, Osofisan identified critical weaknesses in Nigeria’s electoral process, noting that manipulation often begins at the voter registration stage. “Election rigging often starts long before election day,” she said, adding that political actors exploit systemic loopholes through party agents who engage in vote buying.

According to her, while AI can enhance transparency and strengthen electoral processes, it has limitations in directly addressing entrenched human practices such as the inducement of voters. “AI can support transparency and improve monitoring, but it cannot, on its own, eliminate vote buying. Human complicity and weaknesses in the electoral framework are issues technology alone cannot fix,” she stressed.

On his part, the Dean of Social Sciences, University of Lagos, Prof Adelaja Odukoya, cautioned that while the misuse of AI is a global phenomenon, Nigeria’s fragile institutions heighten the risks. “AI is not an autonomous actor; it is an amplifier,” he said, adding that its impact would depend largely on existing political power struggles and institutional strength.
Also speaking, an Information Technology expert, Folajimi Fakoya, urged the Federal Government, the Independent National Electoral Commission and other stakeholders to leverage AI’s potential while guarding against its risks.

Fakoya highlighted AI’s transformative potential in areas such as voter registration and education, which he described as foundational to credible elections. He said specially designed chatbots could revolutionise voter education by providing real-time guidance to millions of citizens simultaneously. “A single chatbot can respond to over one million queries at the same time. No human can do that. Such platforms could promote inclusivity and boost voter turnout among digitally literate populations,” he said.

Fakoya added that AI’s generative capabilities could be explored to produce educational content in locally relevant languages. However, he warned that the technology remains a double-edged sword, capable of producing what he described as “digitally true but physically false” content that can mislead voters. He explained that such manipulated content could be massively targeted and hyper-personalised through algorithms designed to exploit individual biases.

“A voter sees a video of violence at their polling unit on election morning; a party loyalist is confronted with an image of their candidate embracing a rival; a religious follower views a manipulated picture of their spiritual leader, an image calculated to incite violence,” he said. According to him, such tactics could suppress voter participation, inflame ethnic and religious tensions, and overstretch security agencies.
To mitigate these risks, Fakoya called for legislation compelling social media platforms to deploy AI detection tools capable of clearly labelling content generated or manipulated by artificial intelligence. Citing a Yoruba proverb that “the same white man who made the pencil also made the eraser,” he stressed the need for transparency measures, rapid fact-checking systems and aggressive digital literacy campaigns.
To mitigate these risks, Fakoya called for legislation compelling social media platforms to deploy AI detection tools capable of clearly labelling content generated or manipulated by artificial intelligence. Citing a Yoruba proverb that “the same white man who made the pencil also made the eraser,” he stressed the need for transparency measures, rapid fact-checking systems and aggressive digital literacy campaigns. Fakoya added that AI’s generative capabilities could be explored to produce educational content in locally relevant languages. However, he warned that the technology remains a double-edged sword capable of producing what he described as “digitally true but physically false” content that can mislead voters. He explained that such manipulated content could be massively targeted and hyper-personalised through algorithms designed to exploit individual biases. “A voter sees a video of violence at their polling unit on election morning; a party loyalist is confronted with an image of their candidate embracing a rival; a religious follower views a manipulated picture of their spiritual leader, an image calculated to incite violence,” he said. According to him, such tactics could suppress voter participation, inflame ethnic and religious tensions, and overstretch security agencies. To mitigate these risks, Fakoya called for legislation compelling social media platforms to deploy AI detection tools capable of clearly labelling content generated or manipulated by artificial intelligence. Citing a Yoruba proverb that “the same white man who made the pencil also made the eraser,” he stressed the need for transparency measures, rapid fact-checking systems and aggressive digital literacy campaigns. However, he warned that the technology remains a double-edged sword capable of producing what he described as “digitally true but physically false” content that can mislead voters. 
He explained that such manipulated content could be massively targeted and hyper-personalised through algorithms designed to exploit individual biases. “A voter sees a video of violence at their polling unit on election morning; a party loyalist is confronted with an image of their candidate embracing a rival; a religious follower views a manipulated picture of their spiritual leader, an image calculated to incite violence,” he said. According to him, such tactics could suppress voter participation, inflame ethnic and religious tensions, and overstretch security agencies. To mitigate these risks, Fakoya called for legislation compelling social media platforms to deploy AI detection tools capable of clearly labelling content generated or manipulated by artificial intelligence. Citing a Yoruba proverb that “the same white man who made the pencil also made the eraser,” he stressed the need for transparency measures, rapid fact-checking systems and aggressive digital literacy campaigns. He explained that such manipulated content could be massively targeted and hyper-personalised through algorithms designed to exploit individual biases. “A voter sees a video of violence at their polling unit on election morning; a party loyalist is confronted with an image of their candidate embracing a rival; a religious follower views a manipulated picture of their spiritual leader, an image calculated to incite violence,” he said. According to him, such tactics could suppress voter participation, inflame ethnic and religious tensions, and overstretch security agencies. To mitigate these risks, Fakoya called for legislation compelling social media platforms to deploy AI detection tools capable of clearly labelling content generated or manipulated by artificial intelligence. Citing a Yoruba proverb that “the same white man who made the pencil also made the eraser,” he stressed the need for transparency measures, rapid fact-checking systems and aggressive digital literacy campaigns. 
“A voter sees a video of violence at their polling unit on election morning; a party loyalist is confronted with an image of their candidate embracing a rival; a religious follower views a manipulated picture of their spiritual leader, an image calculated to incite violence,” he said. According to him, such tactics could suppress voter participation, inflame ethnic and religious tensions, and overstretch security agencies. To mitigate these risks, Fakoya called for legislation compelling social media platforms to deploy AI detection tools capable of clearly labelling content generated or manipulated by artificial intelligence. Citing a Yoruba proverb that “the same white man who made the pencil also made the eraser,” he stressed the need for transparency measures, rapid fact-checking systems and aggressive digital literacy campaigns. According to him, such tactics could suppress voter participation, inflame ethnic and religious tensions, and overstretch security agencies. To mitigate these risks, Fakoya called for legislation compelling social media platforms to deploy AI detection tools capable of clearly labelling content generated or manipulated by artificial intelligence. Citing a Yoruba proverb that “the same white man who made the pencil also made the eraser,” he stressed the need for transparency measures, rapid fact-checking systems and aggressive digital literacy campaigns. To mitigate these risks, Fakoya called for legislation compelling social media platforms to deploy AI detection tools capable of clearly labelling content generated or manipulated by artificial intelligence. Citing a Yoruba proverb that “the same white man who made the pencil also made the eraser,” he stressed the need for transparency measures, rapid fact-checking systems and aggressive digital literacy campaigns. 
Citing a Yoruba proverb that “the same white man who made the pencil also made the eraser,” he stressed the need for transparency measures, rapid fact-checking systems and aggressive digital literacy campaigns.