This article is written by Aditi Dubey of SOA National Institute of Law.

Evidence decides whether a case is won or lost in court. Not long ago, evidence meant paper files and witness testimony. Today, courts receive huge volumes of digital material: CCTV footage, WhatsApp messages, emails, phone location data and social media posts. Lawyers and judges are buried under this load as they sort files by hand, check whether each item is genuine or fake and track it from the police station to the trial. Mistakes follow: files get lost, mixed up or branded as tampered. With over 4.5 crore cases pending in Indian courts, these slips make the delay worse.
AI and legal tech are starting to fix this. AI scans huge volumes of data quickly, tags key parts, blurs private information, spots fake videos and links proofs to timelines. Tools like C-DAC’s DEMS let police store evidence safely with logs no one can change, while others turn audio into text or predict case strength. But issues like algorithmic bias, lack of transparency and the absence of clear rules remain, even as India’s Bharatiya Sakshya Adhiniyam, 2023 (BSA) strengthens the recognition of digital evidence. This article covers the problems with the old ways, AI tools, court cases, benefits, drawbacks, laws and future steps.
Traditional evidence challenges
Handling evidence the old way brings plenty of headaches. Lawyers and court staff now deal not just with paper documents and photographs but also with digital files like videos and emails. Manual sorting takes days or even weeks: each item is checked by hand to see whether it fits the case, dated and catalogued. One small mistake, such as a wrong label or a lost page, can ruin a trial. In India, the Indian Evidence Act, 1872 set the basic rules. Electronic records were added later, but Section 65B made things strict: a special certificate is needed to prove that digital evidence came from a real device and stayed unchanged. The certificate must name the person who extracted the data and describe the machine. Courts often reject files without it, even important ones. Murder trials and fraud cases drag on because proofs get thrown out.
The Bharatiya Sakshya Adhiniyam, 2023 (BSA) fixes some gaps. It treats electronic records as “documents” if certified correctly, and it covers phones, laptops, cloud storage and more. But the chain of custody still fails. Chain of custody means tracking evidence from the crime scene, through police storage, to the judge’s bench. Files go missing in transfers, get mixed up in busy stations or face tampering claims. Without technology there is no sure proof that the evidence stayed safe. India’s 4.5 crore pending cases grow because of these slips.
AI tools in evidence handling
AI tools take the heavy lifting out of evidence work by automating the dull jobs that eat up time. First, smart tagging sorts files: AI scans videos, photos or chats and, using machine learning, labels key bits like a face in a crowd or a name in an email, so no one has to hand-search thousands of pages. Redaction hides private details fast: tools blur faces, number plates or addresses in footage to follow privacy rules and keep victim information safe before court. Transcription turns audio clips from witness interviews or calls into text, and natural language processing makes it searchable, so lawyers find quotes in seconds.
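To make the redaction idea concrete, here is a minimal sketch in Python, assuming the evidence is plain text such as an exported chat. The phone and email patterns are illustrative stand-ins, not the rules any real tool applies; production systems also redact faces and number plates in images with far richer models.

```python
import re

# Illustrative patterns for personal details often redacted from chat
# exports or transcripts; real tools also handle faces, plates and IDs.
PATTERNS = {
    "phone": re.compile(r"(?:\+91[-\s]?)?\d{10}\b"),      # Indian-style mobile numbers
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),  # email addresses
}

def redact(text: str) -> str:
    """Replace each matched personal detail with a labelled placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[REDACTED {label.upper()}]", text)
    return text

if __name__ == "__main__":
    chat = "Call me at +91 9876543210 or write to victim.name@example.com"
    print(redact(chat))
    # -> Call me at [REDACTED PHONE] or write to [REDACTED EMAIL]
```

The same idea, applied at scale and to richer media, is what lets tools strip victim information before material reaches the courtroom.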
Deepfake detection checks for fakes: AI spots odd pixel jumps, voice mismatches or lip-sync errors in edited videos, which stops criminals from planting false proofs. Metadata extraction grabs hidden facts like capture time, GPS coordinates or edit history from files. In India, C-DAC’s Digital Evidence Management System (DEMS) stands out. Police upload CCTV footage, mobile data or forensic reports to a secure cloud. The system creates hash codes to lock in integrity, so no change goes unnoticed, and every view or share is logged with who, when and why. Prosecutors get tamper-proof links for trials.
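The hash-and-log idea can be shown in a few lines of Python. This is only a conceptual sketch, not C-DAC’s actual DEMS implementation; the file name, officer names and log fields are invented for illustration.

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def fingerprint(path: str) -> str:
    """SHA-256 digest of an evidence file; any later edit changes the digest."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def log_access(logbook: list, path: str, officer: str, action: str) -> None:
    """Append a who/when/why entry alongside the file's current fingerprint."""
    logbook.append({
        "file": path,
        "sha256": fingerprint(path),
        "officer": officer,
        "action": action,
        "time": datetime.now(timezone.utc).isoformat(),
    })

if __name__ == "__main__":
    Path("cctv_clip.mp4").write_bytes(b"demo footage")   # stand-in evidence file
    logbook = []
    log_access(logbook, "cctv_clip.mp4", "SI Sharma", "uploaded at seizure")
    log_access(logbook, "cctv_clip.mp4", "PP Office", "viewed for charge sheet")
    # If the two digests differ, the file changed between the two accesses.
    print(json.dumps(logbook, indent=2))
```

Comparing digests across log entries is what lets a prosecutor say, with a simple check, that the file produced in court is bit-for-bit the one seized.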
Globally, Thomson Reuters’ Case Notebook ties exhibits to case timelines and predicts outcomes from past rulings. Relativity handles e-discovery for big cases, reviewing documents with roughly 90% hit rates. Veritone helps police scan body-cam footage. These tools cut review time from weeks to hours, letting the focus stay on justice.
Landmark case laws
Indian courts have shaped rules for digital evidence through key judgments. These cases guide how AI tools fit into proof handling today. Each ruling stresses reliability and clear steps.
Anvar P.V. v. P.K. Basheer (2014)
In this election dispute, the Supreme Court set a firm rule. A CD with video from a phone lacked a Section 65B certificate. The court said electronic records cannot go in without it. The certificate must name the device operator and swear no changes happened. This overruled past loose practices. Now, courts check this first for any digital file like CCTV or emails. Without it, proof gets rejected outright.
Arjun Panditrao Khotkar v. Kailash Kushanrao (2020)
This case reviewed Anvar. The Supreme Court clarified the certificate need not come at filing time. Oral evidence can explain it later if unavailable initially. But the proof must arrive before the trial ends. This balances strictness with real world needs in fast cases.
State v. Devangana Kalita (Delhi Riots, 2020)
Police matched suspects to riot footage using AI facial recognition. The Delhi High Court doubted the claimed 99% accuracy because of poor night-time lighting and camera angles, and demanded error rates and test data. AI showed promise but needs human backup.
Jaswinder Singh v. State of Punjab (2023)
The court admitted mobile tower data because full chain-of-custody logs from collection to courtroom proved there was no tampering. The case highlights the need for an unbroken trail in digital proofs.
Courts abroad, such as in the US, reject unexplained AI outputs, and India is following suit to preserve trust.
Benefits of AI and legal tech
AI and legal tech offer substantial advantages in evidence management under Indian law. They markedly enhance efficiency by obviating manual perusal of voluminous records. Automated search algorithms expeditiously retrieve pertinent electronic documents, such as intercepted communications or forensic multimedia, thereby expediting investigation timelines and trial proceedings pursuant to Order VII Rule 14 of the Code of Civil Procedure, 1908.
Economically, these innovations curtail expenditure on human resources for classification and redaction. E-discovery costs, often prohibitive under Section 65B compliance, diminish by 30 to 50 percent, rendering justice accessible to indigent litigants as enshrined in Article 39A of the Constitution.
Evidentiary integrity strengthens considerably. Predictive analytics, grounded in precedents from the Supreme Court and High Courts, prognosticate admissibility and probative value, mitigating risks of rejection akin to those in Anvar P.V. v. P.K. Basheer. Compliance with the Digital Personal Data Protection Act, 2023 is automated via intelligent redaction, safeguarding fiduciary data. Immutable blockchain ledgers fortify the chain of custody, forestalling presumptions of tampering under Section 114, illustration (g).
Judicial pendency, exceeding 4.5 crore cases per the National Judicial Data Grid, eases through SUPACE-assisted scrutiny. Ergo, AI fosters the expeditious, equitable dispensation of justice consonant with Article 21.
Challenges and ethics
AI tools in evidence management face serious roadblocks. Bias tops the list: training data often comes from city crowds or light-skinned faces, so facial recognition fails on rural Indians or darker skin tones. Courts may reject such proofs as unfair, hurting equal justice under Article 14 of the Constitution.
The black box problem hides how AI works. Judges cannot see why it picks one video clip over another, which clashes with fair trial rights under Article 21. Without clear steps, as in the Anvar P.V. case, evidence gets tossed out.
Costs add to the burden. Small law firms and public prosecutors cannot afford expert checks or expensive software, so rich parties gain an edge, which works against free legal aid goals. No special law exists for AI evidence: the Bharatiya Sakshya Adhiniyam covers digital files but skips AI-generated reports or predictions.
Privacy risks grow. Hacks can leak victim data despite DPDP Act rules. Cyber attacks hit police servers holding proofs. Judges lack training too. Many struggle with tech terms or tools like SUPACE. They need workshops to spot fakes or check AI outputs.
Framework of laws and regulations
India’s laws are slowly catching up with AI in evidence. The Bharatiya Sakshya Adhiniyam 2023 (BSA) marks a big step. It carries forward the Section 65B rules (now Section 63) and treats all electronic records as documents. Certificates must detail how the data was produced on devices like phones or servers, and courts accept them if chain logs prove no changes. This covers CCTV, emails and mobiles better than the old Evidence Act.
The Information Technology Act 2000 fights cyber tampering under Section 66. It sets rules for digital signatures and secure storage. Digital Personal Data Protection Act 2023 guards privacy during evidence handling. E-Courts project Phase III rolls out SUPACE for judge research and SUVAS for language translation of records. These tools help scan proofs fast.
PIB guidelines push safe AI use in courts with human checks. Still, gaps remain. No standalone AI law sets standards for bias tests or explainability. Courts demand oversight, as in the Arjun Panditrao case. An AI-specific act could mandate audits and training to match global rules.
Future directions
AI in evidence management is heading toward smarter, safer use. Blockchain will link with tools like DEMS for unbreakable chains: every step, from police seizure to courtroom presentation, gets timestamped and hashed, so no one can fake or delete logs, building full trust. Quantum encryption will guard against future hacks on big data stores.
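A toy hash-chained custody log shows why such records are tamper-evident. This is a sketch of the general technique, not how DEMS or any court system actually stores data; the events and names are made up.

```python
import hashlib
import json
from datetime import datetime, timezone

def add_entry(chain: list, event: dict) -> None:
    """Link each custody event to the hash of the previous one."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    record = {
        "event": event,
        "time": datetime.now(timezone.utc).isoformat(),
        "prev_hash": prev_hash,
    }
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()).hexdigest()
    chain.append(record)

def verify(chain: list) -> bool:
    """Recompute every hash; any edited or deleted entry breaks the chain."""
    prev_hash = "0" * 64
    for record in chain:
        if record["prev_hash"] != prev_hash:
            return False
        body = {k: v for k, v in record.items() if k != "hash"}
        if hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest() != record["hash"]:
            return False
        prev_hash = record["hash"]
    return True

if __name__ == "__main__":
    chain = []
    add_entry(chain, {"step": "seized at crime scene", "by": "IO Verma"})
    add_entry(chain, {"step": "deposited in malkhana", "by": "HC Singh"})
    print(verify(chain))                      # True
    chain[0]["event"]["by"] = "someone else"  # simulate tampering
    print(verify(chain))                      # False
```

Because each entry embeds the previous entry’s hash, changing or removing any step invalidates everything after it, which is the property that lets a judge trust the log without trusting any single custodian.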
Training is key. Bar councils must teach India’s 50 lakh lawyers AI basics: how to spot bias and check outputs. Judges need hands-on workshops with SUPACE and deepfake tools. Law schools should add courses on legal tech ethics. Amend the Bharatiya Sakshya Adhiniyam with AI clauses: Daubert-style reliability tests, mandatory explainability and bias audits. Create a National Evidence Tech Authority to set standards and certify tools.
Policy-wise, blend tech with human judgment. Fund free AI access for public prosecutors. Partner with global firms for India-specific models. Roll out DEMS nationwide by 2027. Balanced steps turn AI into a justice booster, not a risk.
Conclusion
AI and legal tech offer powerful fixes for India’s evidence woes. Tools like C-DAC’s DEMS lock proofs with unbreakable logs, while AI scans videos, spots deepfakes and builds timelines fast. Cases from Anvar P.V. to the Delhi riots show courts want reliability: certificates, chain logs and human checks. The Bharatiya Sakshya Adhiniyam 2023 strengthens digital rules but stays silent on AI bias, black boxes and machine-made reports. With 4.5 crore cases pending, speed matters, yet costs and training gaps hold back small lawyers and judges.
The path forward demands balance. Train lawyers on SUPACE and blockchain. Add AI clauses to BSA for explainability tests. Fund nationwide DEMS rollout. Let machines handle sorting, humans judge truth. Smart rules turn tech into justice’s ally, clearing backlogs and building trust from police stations to the Supreme Court.
Frequently Asked Questions
Does BSA 2023 require Section 65B certificates for AI evidence?
Yes, for base electronic records like CCTV footage. AI outputs need extra chain logs plus human oversight affidavits. Anvar P.V. (2014) and Arjun Panditrao (2020) rulings reject unverified AI without error rates or validation proof.
How did State v. Devangana Kalita impact AI evidence rules?
Delhi High Court rejected police facial recognition from riot footage. The court found 99% accuracy claims unproven due to poor lighting and angles. Ordered test data plus error metrics. Ruled AI needs human backup plus Section 63 BSA transparency.
Can blockchain like C-DAC DEMS replace traditional chains of custody?
Partially. Immutable hashes prove no tampering like Jaswinder Singh mobile logs. Courts still demand human certification. Full admissibility requires unbroken procedural trail from crime scene seizure to courtroom trial.
How should Indian courts handle AI bias and black box problems?
Use diverse training data plus mandatory audits. Require explainability disclosures like EU AI Act. Apply Daubert-style judicial gatekeeping tests. SUPACE training ensures Article 14 equality plus Article 21 fair trial rights. Prevents evidence rejections.
Will AI fully replace human judgment in checking evidence?
No. Courts, as in Devangana Kalita, have made it clear that AI tools need human backup to verify accuracy, especially with tricky material like poor-quality video. The BSA 2023 still demands that the final call rest with lawyers and judges to avoid miscarriages of justice.
References
- State v. Devangana Kalita (Delhi Riots, 2020), Delhi HC on AI facial recognition, https://delhidistrictcourts.nic.in/decisions. Last visited January 24, 2026.
- Jaswinder Singh v. State of Punjab (2023), mobile tower data chain of custody, https://highcourtchd.gov.in/?mod=judis. Last visited January 24, 2026.
- C-DAC Digital Evidence Management System (DEMS), https://www.cdac.in/index.aspx?id=product_details&productId=DigitalEvidenceManagementSystem(DEMS). Last visited January 24, 2026.
- Bharatiya Sakshya Adhiniyam 2023, Ratanlal & Dhirajlal, https://store.lexisnexis.com/en-in/products/ratanlal-amp-dhirajlal-the-bharatiya-sakshya-adhiniyam-2023-insku9789395116831.html. Last visited January 24, 2026.
- National Judicial Data Grid, https://njdg.ecourts.gov.in. Last visited January 24, 2026.
- AI Digital Evidence Study, IJIRL, https://ijirl.com/wp-content/uploads/2025/04/ADMISSIBILITY-OF-AI-REVIEWED-DIGITAL-EVIDENCE-IN-LEGAL-INVESTIGATIONS.pdf. Last visited January 24, 2026.
- Thomson Reuters Evidence Tools, https://legal.thomsonreuters.com/en/legal/evidence/electronic-evidence. Last visited January 24, 2026.


