Hacker News | new | past | comments | ask | show | jobs | submit | datan3rd's comments | login

Agreed, seems like a very elegant method to create your own black budget.


and this is exactly why it's not. I think we can all agree that the Hollywood idea of competent people behind the scenes in positions of power has been proven wrong again and again. Any 4D chess move is the accidental consequence of a move so dumb that it becomes unimaginable to us.


There have been similar schemes, like those privacy phones that were directly run by the government.


Exactly what a competent person behind the scenes would say.


we certainly do not agree on this.


Def seems like it's involved in whatever is the next evolution of BCCI.

Can read more about it in this book. I'm not sure where to find the definitive account.

https://www.thriftbooks.com/w/american-exception-empire-and-...


Whitney Webb's new book also has a lot of detail on BCCI

https://www.goodreads.com/book/show/51074723-one-nation-unde...


This looks very interesting. Thanks for the recommendation!


  with persons as (select * from Person)  
  select Name from persons  
  where Birthdate < '2000-01-01'


Where is the assignment to a variable? Where can you construct a query using a variable in table/query position? That's the whole point of being first class and composable, a query becomes like any other value so you should be able to parameterize any query by another query assigned to a variable that may have been set inside an if-statement, or accepted as a parameter to a stored procedure. You know, the same kinds of composition we see in ordinary programming languages.


  create table x as (select * from person);
  select name from x where ...;
there you go, just configure your editor to display "create table x" as "declare x = " ;)

or even a version with lazy evaluation:

  create view x as (select * from person);
  select name from x where ...;


You're still not getting it. First-class status means that anywhere a value or variable can be used, a query or table should also be able to appear, and vice versa. This means a table or query can appear as a return type, a parameter to a stored procedure or query, a variable, and so on.

SQL just does not have this, it instead has 15 different second class ways to handle tables and queries that try to make up for the fact that they are not first-class values. These include CTEs, table valued functions, views, etc.
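To make that concrete, here's a small sketch (SQLite via Python; table and column names are hypothetical) of the same derived table expressed three of those different second-class ways — inline subquery, CTE, and view — each with its own syntax and its own restrictions on where it may appear:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE person (name TEXT, birthdate TEXT)")
conn.execute("INSERT INTO person VALUES ('Ann', '1995-05-01'), ('Bob', '2005-07-09')")

# 1. Inline subquery (derived table)
q1 = conn.execute(
    "SELECT name FROM (SELECT * FROM person) WHERE birthdate < '2000-01-01'"
).fetchall()

# 2. Common table expression
q2 = conn.execute(
    "WITH persons AS (SELECT * FROM person) "
    "SELECT name FROM persons WHERE birthdate < '2000-01-01'"
).fetchall()

# 3. View
conn.execute("CREATE VIEW persons_v AS SELECT * FROM person")
q3 = conn.execute(
    "SELECT name FROM persons_v WHERE birthdate < '2000-01-01'"
).fetchall()

# All three spellings of "a query over person" produce the same rows,
# but none of them is a value you can assign, pass, or return.
print(q1 == q2 == q3)
```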


Usually I balk at the idea of someone describing a language feature as “first class” because it seems too wishy-washy. But in this thread you’ve shown me that maybe the best definition is through “proof by negation,” by patiently responding to arguments and demonstrating why a certain usage and the ensuing restriction around it means it is not first class. Bravo!


I agree the term is often abused, but I think the wikipedia page actually does a decent job of making the notion somewhat precise, along the lines I've been arguing here:

https://en.wikipedia.org/wiki/First-class_citizen

If you want to see what queries as first-class values looks like, LINQ in .NET is pretty close. I can actually write a series of queries that build on and compose with each other, like this:

    IQueryable<Person> RunQuery(int userSelection)
    {
        var first = from x in People
                    select x;
        var second = userSelection == 1
            ? (from x in first where x.Birthday > new DateTime(2000, 1, 1) select x)
            : (from x in first where x.Name.Contains("Jane") select x);
        return DumbJoin(first, second);
    }

    IQueryable<Person> DumbJoin(IQueryable<Person> first, IQueryable<Person> second)
    {
        return from x in second
               join y in first on x.Role equals y.Role
               select x;
    }
This query is nonsense, but it just shows you what composition really looks like when queries are first-class values. I wish raw SQL were like this!


> You're still not getting it. First-class status means that anywhere a value or variable can be used, a query or table should also be able to appear, and vice versa. This means a table or query can appear as a return type, a parameter to a stored procedure or query, a variable, and so on.

I doubt you could implement a query planner that would cope with that degree of flexibility. Which means you’d be forced to deal with the mechanics of the query, pushing you away from declarative SQL and into procedural and functional programming. At which point you might as well ditch SQL anyway.


Without these features, people have to resort to dynamically generated SQL queries in procedural or functional languages, which is much worse! SQL has also become significantly more complicated by adding all sorts of second-class features to get around this composability limitation (CTEs, table valued functions, views, etc.).

Besides, I don't think it would be as bad as you say. You can approach it as a simple template expansion into flat SQL queries except where a data dependency occurs, at which point template expansion proceeds in stages, one for each dependency.

LINQ on .NET provides most of the composability I'm talking about, although it has a few limitations as well. Still worlds better than raw SQL.
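Here's a minimal sketch (Python + sqlite3; table and column names hypothetical) of the string-pasting workaround being described — the table position can't be filled by an ordinary bind parameter, so it gets spliced into the query text:

```python
import sqlite3

def build_query(table, min_date=None):
    # The table name must be pasted into the string, because SQL
    # placeholders only work in value positions, not table positions.
    sql = f"SELECT name FROM {table}"  # injection-prone if `table` is untrusted
    if min_date is not None:
        sql += " WHERE birthdate >= ?"
    return sql

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE person (name TEXT, birthdate TEXT)")
conn.execute("INSERT INTO person VALUES ('Ann', '1995-05-01'), ('Bob', '2005-07-09')")

rows = conn.execute(build_query("person", "2000-01-01"), ("2000-01-01",)).fetchall()
```

The filter value rides along safely as a bound parameter, but the table is raw string interpolation — which is the "much worse" part.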


In PostgreSQL at least, a table can appear as a return type of a function and as a parameter to a function. That's not nothing.


What if I wrote a very long, complicated query that I'd like to test against different tables (like test tables), and let's say I can't use stored functions or procedures. How could I pass different tables to my query?


  CREATE TABLE data_a AS (SELECT 'a' AS test_case, 1 AS value);
  CREATE TABLE data_b AS (SELECT 'b' AS test_case, 2 AS value);
  CREATE VIEW data AS (SELECT * FROM data_a UNION ALL SELECT * FROM data_b);

  CREATE VIEW complicated_query AS (SELECT test_case, value+1 FROM data);

  SELECT * FROM complicated_query WHERE test_case = 'a';
  SELECT * FROM complicated_query WHERE test_case = 'b';


Nice, that is what I was looking for. Of course, it'd need to point to production data as well, so maybe test_case is null, in that case:

  CREATE TABLE data_a AS (SELECT 'a' AS test_case, 1 AS value);
  CREATE TABLE data_b AS (SELECT 'b' AS test_case, 2 AS value);
  CREATE TABLE data_prod AS (SELECT NULL AS test_case, prod_table.value FROM prod_table);

  CREATE VIEW data AS (SELECT * FROM data_a UNION ALL SELECT * FROM data_b UNION ALL SELECT * FROM data_prod);

  CREATE VIEW complicated_query AS (SELECT test_case, value+1 FROM data);

  -- when testing
  SELECT * FROM complicated_query WHERE test_case = 'a';
  SELECT * FROM complicated_query WHERE test_case = 'b';

  -- when in 'production'
  SELECT * FROM complicated_query WHERE test_case IS NULL;


You just reinvented defunctionalization, which is a transformation from a domain that has first-class values to a domain where support is only second-class. Defunctionalization is typically used in programming languages to simulate first-class functions in languages where functions are only second-class citizens, like C and Pascal.

This perfectly illustrates my point. You had to manually defunctionalize your data model and queries to support what I'm saying should be inherently part of SQL.
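For readers unfamiliar with the term, a tiny sketch of defunctionalization (names hypothetical): first-class functions are replaced by tags plus a single apply/dispatch function — which is exactly the shape of a test_case column plus a view that switches on it:

```python
# With first-class functions you'd just write: pipeline = [inc, double]
# Defunctionalized: each function becomes a tag, and one dispatcher
# interprets the tags.
INC, DOUBLE = "inc", "double"  # second-class stand-ins for functions

def apply(tag, x):
    if tag == INC:
        return x + 1
    if tag == DOUBLE:
        return x * 2
    raise ValueError(tag)

pipeline = [INC, DOUBLE]  # plain data: can be stored, compared, sent over a wire
result = 3
for tag in pipeline:
    result = apply(tag, result)
```

The cost is the same as in SQL: every new "function" requires editing the central dispatcher, instead of just being a new value.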


> languages where functions are only second-class citizens, like C and Pascal.

1) Only if you define Pascal as only Wirth's very first version. That changed almost immediately.

2) Only if you refuse to equate “pointer to function” with “function”. Which in C, where “everything is a pointer” (a bit like “everything is a file” in Unix/Linux), seems rather silly.


Check out dbt - it's a great tool for organizing queries and solving such patterns


If you can't use stored procedures which are good for this very case, many databases offer dynamic SQL. That might work in some cases.


That's a good point. This would rule out SQLite for me.


Dynamic SQL isn’t SQL, and it’s not relational. It’s no different from using a language like Python to generate SQL queries.


It's a little different. Anyway, this is under the constraint "no stored procedures."


TVF (Table-Valued Function) with CROSS APPLY.


Detailed web event telemetry is where I have seen the "biggest" data, not application-generated data. Orders, customers, products will always be within reasonable limits. Generating 100s of events (and their associated properties) for every single page/app view to track impressions, clicks, scrolls, page-quality measurements can get you to billions of rows and TBs of data pretty quickly for a moderately popular site. Convincing technical leaders to delete old, unused data has been difficult; convincing product owners to instrument fewer events is even harder.


My idea is for everyone to have their own personal website/app/space (could be very basic, prebuilt templates, drag and drop, something your grandparents could set up). That would then lead to the development of social networking protocols or being able to subscribe to web content modules. Basically, I want RSS feeds for web components/modules, but then a personal portal to interact with the items I subscribe to.

I, as userA, with site www.squarespace.com/userA, could subscribe to all or part of userB's site www.wix.com/userB or www.userB.com/photos but not www.userB.com/crazyBlog. Then, on your own site/app, you choose the things you are subscribed to that you want to "re-publish" or add comments to or share. userB could also choose to not let you follow their space.

This decentralizes away from any particular company and should limit the unintentional crazy that is broadcast across current platforms.


I think email might be a good system to model this on. In addition to an inbox, almost all providers provide a Spam folder, and others like Gmail separate items into 'Promotions' and 'Social' folders/labels. I imagine almost nobody objects to this.

Why can't social media follow a similar methodology? There is no requirement that FB/Twitter/Insta/etc feeds be a single "unit". The primary experience would be a main feed (uncontroversial), but additional feeds/labels would be available to view platform-labeled content. A "Spam Feed" and a "Controversial Feed" and a "This Might Be Misinformation Feed".

Rather than censoring content, it segregates it. Users are free to seek/view that content, but must implicitly acknowledge the platform's opinion by clicking into that content. Just like you know you are looking at "something else" when you go to your email Spam folder, you would be aware that you are venturing off the beaten path when going to the "Potential State-Sponsored Propaganda Feed". There must be some implicit trust in a singular feed which is why current removal/censorship schemas cause such "passionate" responses.


My IQ is a perfect 100!


Ayy 50th percentile gang


Awkward syntax — when developing the query, commenting out the final line of the SELECT list causes a syntax error because of how commas are handled, and we need to repeat the columns in the GROUP BY clause in the SELECT list.

There are some SQL varieties that actually allow a hanging comma! Also, the provided examples seem comma-dependent, no?

As someone who writes a ton of analytical SQL, I think this would get super messy for long, complex queries with casting, case statements, window functions, etc.

Most people just need to learn to write better SQL!
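A quick check of the comment-out problem (SQLite here; a few dialects, such as BigQuery, do tolerate a trailing comma in the SELECT list), along with the leading-comma convention many analysts use to dodge it:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (a INTEGER, b INTEGER)")
conn.execute("INSERT INTO t VALUES (1, 2)")

# Commenting out the last column leaves a dangling comma -> syntax error.
bad = """
SELECT
    a,
    -- b
FROM t
"""
try:
    conn.execute(bad)
    trailing_comma_ok = True
except sqlite3.OperationalError:
    trailing_comma_ok = False

# The usual workaround: leading commas, so the last line comments out cleanly.
good = """
SELECT
    a
    -- , b
FROM t
"""
rows = conn.execute(good).fetchall()
```

Of course, the leading-comma style just moves the problem: now it's the *first* column you can't comment out.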


Was an actuary for 8 years, got my Associateship (roughly halfway through the progression to Fellow). Failed a few of the exams a couple of times, passed a few on the first try. At the time, each exam had a roughly 40-60% pass rate, so by the time you get to the 4th or 5th exam, you are dealing with a pretty smart, invested group...AND it is still difficult to pass.

My job mainly consisted of statistics and data applied specifically to insurance, which I found very boring. I was in life insurance, which is quite simple (people only die once!). Health insurance would have been more interesting, but I am in favor of single payor, so this would have been difficult for me. Property & Casualty (vehicles, property, events, umbrella, custom, etc) would have been the most interesting.

Once I decided to leave, I got an MBA (core courses were trivial due to actuarial knowledge of stats, finance, econ, accounting, etc), and ended up in Analytics/DataScience at tech companies, where my skills transferred quite well.


Congratulations from someone who has passed three of the exams, and then bailed.

I would hire anyone who has passed even a single SOA/CAS exam in a Data Science role in a heartbeat - likely above most other candidates. While I don't directly monitor the "State of the Actuaries", I would have expected the industry to better position itself as the preeminent "Data Science" candidate-incubator and stretch beyond insurance. I haven't seen an actuarial program that didn't cover computing science as part of degree requirements - and in my case, I also met all degree requirements for BSc Statistics. I did see, however, that the SOA added a predictive analytics exam, exercised in the R language... so that's a start.

/resists urge to bash the term "Data Science" as a Science... because it's really just a combination of {actuarial,stats,cs} which are real science disciplines.

