Privacy : Not Right ?
===============
Here is why :
Attorney General KK Venugopal submitted to the 9-judge bench :

“ There is no fundamental right to privacy, and even if it is assumed as a fundamental right, it is multifaceted. Every facet can’t be ipso facto considered a fundamental right. Informational privacy could not be a right to privacy, and it could not ever be a fundamental right. ”
( source : DNA / 28 July )
The bench drew Sundaram’s attention to present-day reality, when rapid technological advance is making individual privacy increasingly vulnerable.
“ Do we have a robust data protection regime to protect and secure personal information ? ” it asked, indicating its willingness to look at privacy afresh without being burdened by past rulings.
“ If we accept privacy as a constitutional right, it will have to be part of personal liberty and the right to life guaranteed under Article 21 of the Constitution, ” it said.
( source : Times of India / 28 July )
Senior advocate Sundaram, representing the Maharashtra government, told the court : “ Privacy can mean a lot of things. Moreover, the right to privacy was considered by the Constitution makers, but they decided to drop it as a fundamental right. ”
( source : Hindustan Times / 28 July )
All of this sounds so rational !
Especially the recognition that the “ March of Technology ” will render futile all arguments regarding the “ Right to Privacy ”.
What could have brought about this changed perception on the part of the Hon’ble Judges ?

Perhaps the following news ?
Artificial intelligence ' Judge ' developed by UCL computer scientists

( source : http://www.ucl.ac.uk/news/news-articles/1016/241016-AI-predicts-outcomes-human-rights-trials / 24 Oct 2016 )
Artificial intelligence software that can find patterns in highly complex decisions is being used to predict our taste in films, TV shows and music with ever-increasing accuracy. And now, after a breakthrough study by a group of British scientists, it could be used to predict the outcome of trials.
Software that is able to weigh up legal evidence and moral questions of right and wrong has been devised by computer scientists at University College London, and used to accurately predict the result in hundreds of real-life cases.
The AI “judge” has reached the same verdicts as judges at the European Court of Human Rights in almost four in five cases involving torture, degrading treatment and privacy.
The algorithm examined English-language data sets for 584 cases relating to torture and degrading treatment, fair trials and privacy. In each case, the software analysed the information and made its own judicial decision. In 79% of those assessed, the AI verdict was the same as the one delivered by the court.
Dr Nikolaos Aletras, the lead researcher from UCL’s department of computer science, said: “We don’t see AI replacing judges or lawyers, but we think they’d find it useful for rapidly identifying patterns in cases that lead to certain outcomes.
“It could also be a valuable tool for highlighting which cases are most likely to be violations of the European Convention on Human Rights.” An equal number of “violation” and “non-violation” cases were chosen for the study.
In the course of developing the programme, the team found that judgments of the European Court of Human Rights depend more on non-legal facts than on purely legal arguments. This suggests that the court’s judges are more legal-theory “realists” than “formalists”. The same is true of other high-level courts, such as the US Supreme Court, according to previous studies.
The most reliable factors for predicting European Court of Human Rights decisions were found to be the language used, as well as the topics and circumstances mentioned in the case texts.
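What might such a prediction pipeline look like in code ? Here is a minimal, hypothetical sketch in Python ( scikit-learn ) : word / phrase features drawn from the case text, feeding a linear classifier. The tiny corpus, the labels and the exact feature choices below are my own illustrative assumptions; the UCL researchers’ actual method and data are described in the PeerJ paper linked further below.

```python
# Illustrative sketch only: a generic text-classification pipeline of the kind
# the article describes (word/phrase features from case text + a linear
# classifier). NOT the UCL team's actual code or data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Hypothetical toy corpus: the text of each case and its known outcome
# (1 = violation found by the court, 0 = no violation).
# The real study used 584 ECHR cases, balanced between the two labels.
case_texts = [
    "The applicant alleged degrading treatment while in police detention ...",
    "The applicant complained of interference with his private and family life ...",
    "The domestic courts examined the complaint and found the detention lawful ...",
    "The surveillance was authorised by a judge and limited in scope ...",
]
labels = [1, 1, 0, 0]

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), stop_words="english"),  # the language used in the case text
    LinearSVC(),                                                # a linear decision over those features
)
model.fit(case_texts, labels)

# Predict the outcome of an unseen case description.
new_case = ["The applicant was held without judicial review and subjected to ill-treatment ..."]
print(model.predict(new_case))  # 1 -> violation predicted, 0 -> no violation
```

Because the study deliberately used an equal number of “violation” and “non-violation” cases, guessing at random would score about 50% ; that is what makes the reported 79% agreement with the court a meaningful result.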
One of these days, expect some Indian Start-up in the LEGAL DOMAIN to upload all the past Orders / Judgements of the Hon’ble Judges ( of the 9-member bench ) into this algorithm.

{ They may want to first look up : https://peerj.com/articles/cs-93/ }
Will the outcome read ?

PRIVACY LEFT : NO MORE RIGHT
31 July 2017
www.hemenparekh.in / blogs