A critical review of the politics of artificial intelligent machines, alienation and the existential risk threat to America's labour force

Damasevicius R., Assibong P., Adewumi A., Maskeliunas R., Misra S., Wogu I.A. (2018) A critical review of the politics of artificial intelligent machines, alienation and the existential risk threat to America's labour force, in Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) 10963 LNCS, 217-232.

Nine Ways to Bias Open-Source Artificial General Intelligence Toward Friendliness

Goertzel B., Pitt J. (2014) Nine Ways to Bias Open-Source Artificial General Intelligence Toward Friendliness, in Intelligence Unbound: The Future of Uploaded and Machine Minds [volume not available], 61-89.

Who Knows Anything about Anything about AI?

Ó hÉigeartaigh S., Armstrong S. (2014) Who Knows Anything about Anything about AI?, in Intelligence Unbound: The Future of Uploaded and Machine Minds [volume not available], 46-60.

Is IR going extinct?

Mitchell A. (2017) Is IR going extinct?, in European Journal of International Relations 23, 3-25.

System approach to management of catastrophic risks

Ermoliev Y.M., Ermolieva T.Y., MacDonald G.J., Norkin V.I., Amendola A. (2000) System approach to management of catastrophic risks, in European Journal of Operational Research 122, 452-460.

The doomsday argument

Leslie J. (1992) The doomsday argument, in The Mathematical Intelligencer 14, 48-51.

Value of global catastrophic risk (GCR) information: Cost-effectiveness-based approach for GCR reduction

Barrett A.M. (2017) Value of global catastrophic risk (GCR) information: Cost-effectiveness-based approach for GCR reduction, in Decision Analysis 14, 187-203.

Moral bioenhancement and agential risks: Good and bad outcomes

Torres P. (2017) Moral bioenhancement and agential risks: Good and bad outcomes, in Bioethics 31, 691-696.

Why reason matters: Connecting research on human reason to the challenges of the Anthropocene

Barr N., Pennycook G. (2018) Why reason matters: Connecting research on human reason to the challenges of the Anthropocene, in The New Reflectionism in Cognitive Psychology: Why Reason Matters [volume not available], 119-142.

Reconciliation between factions focused on near-term and long-term artificial intelligence

Baum S.D. (2018) Reconciliation between factions focused on near-term and long-term artificial intelligence, in AI and Society 33, 565-572.
