Compare commits


43 Commits
4.10 ... 2.0.2

Author SHA1 Message Date
Tim Abbott
f8d74fab08 Release Zulip Server 2.0.2. 2019-03-15 11:39:10 -07:00
vsvipul
049b83f0bb image-action: Fix open and download hover highlight in night mode.
When hovering over Open or Download, they were not highlighted in
night mode, because of incorrect specificity. This commit adds
highlighting in night mode (possibly fixing a regression from when we
made night mode less aggressive about hover).

Fixes #11887.
2019-03-15 11:39:10 -07:00
Tim Abbott
bffa709ec8 auth: Use HTTP status 404 for invalid realms.
Apparently, our invalid realm error page had HTTP status 200, which
could be confusing and in particular broke our mobile app's error
handling for this case.
2019-03-15 11:32:28 -07:00
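The fix boils down to returning 404 rather than 200 from the invalid-realm error page, so API clients can detect the error from the status code alone. A minimal sketch of the idea (the function name and body text here are illustrative, not Zulip's actual view code):

```python
from http import HTTPStatus


def invalid_realm_response():
    """Build the "invalid realm" error response.

    Returning 404 (not 200) lets clients such as the mobile app
    recognize the error from the status code alone, instead of
    having to parse the error page's HTML.
    """
    body = "There is no Zulip organization hosted at this subdomain."
    return HTTPStatus.NOT_FOUND, body


status, body = invalid_realm_response()
```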
Tim Abbott
ddae999601 send_test_email: Clean up output and provide advice.
Previously, while we sent emails using both noreply addresses, we
didn't make clear what was going on, leading to some potential
confusion.
2019-03-15 11:32:12 -07:00
vsvipul
57cd185366 portico: Fix broken electron check condition for password reset.
This logic for passing through whether the user was logged in never
worked, because we were trying to read the client.

Fix this, and add tests to ensure it never breaks again.

Restructured by tabbott to have completely different code with the
same intent.

Fixes #11802.
2019-03-15 11:32:03 -07:00
Tim Abbott
d39a7ea429 slack import: Fix handling of tombstone files.
Apparently, the mode attribute is not always present.
2019-03-15 11:31:21 -07:00
Tim Abbott
088f8745d1 slack import: Skip processing tombstone files.
The tombstone files are an undocumented feature of Slack's export
format that appears sometimes and carries no real data, so we just
need to skip them.

Fixes #11619.
2019-03-13 12:55:51 -07:00
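Together, these two commits amount to: read the `mode` attribute defensively (it is not always present) and skip any record marked as a tombstone. A sketch of that filtering step (hypothetical helper name; the real code lives in Zulip's Slack import tool):

```python
def iter_real_files(slack_files):
    """Yield Slack export file records, skipping tombstones.

    Tombstone records are an undocumented Slack export feature with
    no real data. Note the "mode" attribute is not always present,
    so we use .get() rather than indexing to avoid a KeyError.
    """
    for record in slack_files:
        if record.get("mode") == "tombstone":
            continue
        yield record


files = [
    {"id": "F1", "mode": "hosted"},
    {"id": "F2", "mode": "tombstone"},
    {"id": "F3"},  # no "mode" attribute at all
]
kept = [f["id"] for f in iter_real_files(files)]
```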
Harshit Bansal
f32f02da8b ldap: Ensure email is valid for realm before registering.
Previously, the LDAP authentication model ignored the realm-level
settings for who can join a realm.  This was sort of reasonable at the
time, because the original LDAP auth was an SSO solution that didn't
allow multiple realms, and so one could fully configure authentication
settings on the LDAP side.  But now that we allow multiple realms with
the LDAP backend, one could easily imagine wanting different
restrictions on them, and so it makes sense to add this enforcement.
2019-03-13 12:55:42 -07:00
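The enforcement described above means an LDAP login must also pass the realm's own registration restrictions before an account is created. A simplified sketch of such a check (hypothetical signature; Zulip's actual logic lives in its auth backends and considers more settings):

```python
def email_allowed_for_realm(email, realm_domains, emails_restricted):
    """Apply realm-level registration restrictions to an LDAP login.

    If the realm restricts signups to certain domains, reject emails
    outside them; otherwise any authenticated LDAP user may register.
    """
    if not emails_restricted:
        return True
    domain = email.rsplit("@", 1)[-1].lower()
    return domain in realm_domains


allowed = email_allowed_for_realm("ada@example.com", {"example.com"}, True)
```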
Anders Kaseorg
76d6d69568 css: Replace generated U+202A LEFT-TO-RIGHT EMBEDDING with CSS properties.
These generated characters (added in #9889) were causing poor wrapping
behavior, at least in Firefox.

Signed-off-by: Anders Kaseorg <andersk@mit.edu>
2019-03-13 12:55:21 -07:00
Tim Abbott
4a1e98f574 stream: Fix validator for stream colors.
Apparently, our new validator for stream color having a valid format
incorrectly handled colors that had duplicate characters in them.

(This is caused in part by the spectrum.js logic automatically
converting #ffff00 to #ff0, which our validator rejected).  Given that
we had old stream colors in the #ff0 format in our database anyway for
legacy reasons, there's no benefit to banning these colors.

In the future, we could imagine standardizing the format, but doing so
will require also changing the frontend to submit colors only in the
6-character format.

Fixes an issue reported in
https://github.com/zulip/zulip/issues/11845#issuecomment-471417073
2019-03-13 12:55:00 -07:00
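The resulting validator needs to accept both the 3-digit form (which spectrum.js produces by shortening, e.g., #ffff00 to #ff0) and the 6-digit form. A sketch of such a check (an assumed regex, not necessarily Zulip's exact one):

```python
import re

# Accept both 3- and 6-digit hex colors: spectrum.js shortens
# "#ffff00" to "#ff0", and legacy colors in the short form already
# exist in the database, so banning them would break old streams.
STREAM_COLOR_RE = re.compile(r"^#([0-9a-fA-F]{3}|[0-9a-fA-F]{6})$")


def check_color(color):
    """Return True if `color` is a valid 3- or 6-digit hex color."""
    return bool(STREAM_COLOR_RE.match(color))
```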
Eeshan Garg
487632b454 webhooks/zapier: Support authentication for the Zapier Zulip app.
If the user sets up a Zap using an incoming webhook bot's API
key, the authentication goes through our webhook.
2019-03-13 12:54:51 -07:00
Eeshan Garg
848276ee3b webhooks/github: Ignore organization and milestone events.
These events are not super useful and were cluttering up our
webhook logs.
2019-03-13 12:54:48 -07:00
Eeshan Garg
d740b1ae19 webhooks/github: Restrict membership event scope to teams.
According to GitHub's webhook docs, the scope of a membership
event can only be limited to 'teams', which holds true when a
new member is added to a team. However, we just found a payload
in our logs that indicates that when a user is removed from a
team, the scope of the membership is erroneously set to
'organization', not 'team'. This is most likely a bug on
GitHub's end because such behaviour is a direct violation of
their webhook API event specifications. We account for this
by restricting membership events to teams explicitly, at least
till GitHub's docs suggest otherwise.
2019-03-13 12:54:45 -07:00
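In other words, rather than trusting GitHub's documented invariant, the webhook filters on the payload's `scope` field explicitly. A sketch of that guard (hypothetical function name; the payload shape follows GitHub's membership event docs):

```python
def should_handle_membership_event(payload):
    """Only process membership events scoped to a team.

    GitHub documents the scope as always being "team", but removal
    payloads have been observed with scope "organization"; filtering
    explicitly protects us either way.
    """
    return payload.get("scope") == "team"


handle_add = should_handle_membership_event({"scope": "team", "action": "added"})
handle_remove = should_handle_membership_event({"scope": "organization", "action": "removed"})
```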
Tim Abbott
48d8b90863 docs: Recommend using an SSD for the Zulip database.
This is sorta common knowledge for folks who have managed databases,
but not everyone has.
2019-03-13 12:54:31 -07:00
Pragati Agrawal
eeeb947187 node_tests: Refactor test_change_save_button_state in settings_org.
This refactors the `test_change_save_button_state` function using ES6
syntax, to make it cleaner and more readable.
2019-03-13 12:54:24 -07:00
Pragati Agrawal
507cf1d322 settings_org: Fix visibility time of Saved state.
This fixes the bug where the `Saved` state button faded out almost
instantly (actually after 300 ms) and the `Discard` button faded out
along with it; the key problem was that the setTimeout intended to
fade was actually delaying the transition from "saving" to "saved".

Now, first of all, we use the `setTimeout` function to fade out elements
with a fadeout_delay time of 800 ms, and we hide the discard button during
the `saving` state. Also, when the `Discard` button is selected, `Save
changes` and `Discard` fade out simultaneously.

Fixes: #11737.
2019-03-13 12:54:21 -07:00
Pragati Agrawal
f3f90bb527 settings_org: Refactor change_save_button_state function.
This makes the `change_save_button_state` function clearer and more
readable by removing many repeated calls to `.find()` and `.attr()`.
2019-03-13 12:54:17 -07:00
Harshit Bansal
46d6541958 tests: Refactor query_ldap() and add complete test coverage. 2019-03-13 12:54:06 -07:00
Harshit Bansal
13eaa49a42 management: Move query_ldap function to zproject/backends.py.
This will make it simpler to organize and unit-test all of our
authentication backend code.
2019-03-13 12:54:02 -07:00
Tim Abbott
1157aef8b3 night mode: Fix initial state of night mode. 2019-03-13 12:53:33 -07:00
Boris Yankov
65eb125d61 cleanup: Remove unnecessary 'magic' style for night mode.
This was introduced in e0236646.

In 1.5 years we did not find a case that needed it (besides the
`a` tag hover state, where it is not obvious whether it was needed or
was merely used as an example).

It is not obvious if this solution was a good idea. The concern was
that `body.night-mode` is more specific than `body` and some styles
might override others less specific in cases we might not want that.

Of course, we want that in the majority of cases, and css-specificity
rules are not simple to comprehend.

Good further reading:
http://cssspecificity.com/
https://specificity.keegan.st/

The added complexity of the resulting styles and the added code that
might not serve any practical purpose seem to not be worth it.
2019-03-13 12:53:30 -07:00
Ben Muschol
713d6739ec linkifiers: Add no-select to trash icon.
This fixes some annoying copy-paste issues we've seen with users
accidentally getting a weird invisible unicode character in their URL
format string when trying to copy-paste an existing linkifier to
use for a new linkifier.

Fixes #10828.
2019-03-13 12:53:06 -07:00
Tim Abbott
70c0c7a83f node: Fix a node test broken by recent narrowing fix.
The changes in 3baf1f3dbd required some
additions to our test setup code.
2019-03-13 12:52:46 -07:00
Tim Abbott
c1ee7692d6 narrow: Remove "subscribe" button for guests for empty streams.
This button didn't work, because the backend blocks subscribing, so it
was just confusing.

Fixes an issue reported in #11743.
2019-03-13 12:52:37 -07:00
Abhinav Singh
ad336800d0 sidebar: Allow users to use sidebar search in mobile browser.
It was impossible to search people in mobile browsers because the
virtual keyboard fired a resize event, and the function call we used
to handle this event caused the input field to lose focus, making it
impossible to type in the people search bar.

The code in this commit fixes this by simply ignoring the resize
events when the user wants to search.

Fixes #11795.
2019-03-13 12:52:29 -07:00
Anders Kaseorg
e9e3eafdde drafts: Fix CSS transition when opening drafts.
The code was all there, but we weren't triggering a style calculation.

Signed-off-by: Anders Kaseorg <andersk@mit.edu>
2019-03-13 12:52:14 -07:00
Rohitt Vashishtha
df68a3e963 Revert "bugdown: Process word boundaries properly in realm_filters."
This reverts commit ff90c0101c but keeps
the test cases added for reference.

This was reverted because it was both not a clean solution and created
other realm filters bugs involving dashes (etc.).
2019-03-13 12:51:32 -07:00
Tim Abbott
faaf84bb01 puppet: Fix nginx configuration logic for S3 backend.
Apparently, our testing environment for this configuration was broken
and didn't test the code we thought it did; as a result, a variable
redefinition bug slipped through.

Fixes #11786.
2019-03-13 12:51:11 -07:00
Harshit Bansal
c082547021 ldap: Continue syncing other fields even if a field is missing.
Earlier the behavior was to raise an exception thereby stopping the
whole sync. Now we log an error message and skip the field. Also
fixes the `query_ldap` command to report missing fields without
error.

Fixes: #11780.
2019-03-13 12:50:56 -07:00
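The pattern here is the standard one for batch syncs: log and skip the bad field instead of letting one missing attribute abort the whole loop. A sketch under assumed names (`sync_user_fields` and its arguments are illustrative; the real code is in zproject/backends.py):

```python
import logging

logger = logging.getLogger(__name__)


def sync_user_fields(ldap_attrs, field_map):
    """Copy each mapped LDAP attribute, skipping missing ones.

    The earlier behavior raised on the first missing attribute,
    stopping the whole sync; now we log an error and continue with
    the remaining fields.
    """
    synced = {}
    for target, ldap_name in field_map.items():
        if ldap_name not in ldap_attrs:
            logger.error("LDAP attribute %s is missing; skipping", ldap_name)
            continue
        synced[target] = ldap_attrs[ldap_name]
    return synced


result = sync_user_fields(
    {"cn": "Ada Lovelace"},
    {"full_name": "cn", "phone": "telephoneNumber"},
)
```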
Tim Abbott
d6c7199ce1 i18n: Update translation data from Transifex. 2019-03-13 12:49:03 -07:00
Tim Abbott
29b3dd0852 Release Zulip Server 2.0.1. 2019-03-04 17:39:57 -08:00
Tim Abbott
0ffc42083e i18n: Update translations from Transifex. 2019-03-04 17:28:30 -08:00
Tim Abbott
019e5a17f0 docs: Explain options for preventing changes during export.
This makes it a bit clearer that one doesn't need to deactivate a
realm just to export data.
2019-03-04 16:22:18 -08:00
Harshit Bansal
177673c84e portico: Refresh deactivated realm notice page every 60 seconds.
This helps avoid users being confused if a realm was temporarily
deactivated as part of getting a clean backup.

Fixes: #11757.
2019-03-04 16:22:10 -08:00
Harshit Bansal
f6c1a31988 auth: Remove invalid_subdomain restriction from LDAP backend.
Fixes: #11692.
2019-03-04 16:22:04 -08:00
Tim Abbott
870cd00f5f su_to_zulip: Fix detection of zulip user ID.
Apparently, while upgrade-zulip-from-git always ensures that zulip
deployment directories are owned by the Zulip user, unpack-zulip (aka
the tarball code path) has them owned by root.

The user ID detection logic in su_to_zulip's helper get_zulip_uid was
intended to support both development environments (where the user ID
might vary) and production environments.  For development
environments, the existing code is fine, but given this unpack-zulip
permissions issue, we need code to fall back to 'zulip' if the
detection logic detects the "zulip" user as having UID 0.
2019-03-04 16:21:53 -08:00
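The fallback logic can be sketched as a pure function over the two UIDs involved (a hypothetical signature illustrating su_to_zulip's `get_zulip_uid` behavior, not its actual code):

```python
def get_zulip_uid(deploy_dir_owner_uid, real_zulip_user_uid):
    """Pick the UID to switch to.

    Development environments may run as an arbitrary user, so we
    normally trust the owner of the deployment directory; but tarball
    installs (unpack-zulip) leave it owned by root (UID 0), in which
    case we fall back to the real "zulip" user's UID.
    """
    if deploy_dir_owner_uid == 0:
        return real_zulip_user_uid
    return deploy_dir_owner_uid
```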
Aaron Raimist
7db599deaa docs: Fix Learn more about mentions link.
It seems like 1871d00bb2 renamed `/help/at-mention-a-user` to `/help/mention-a-user-or-group`, but missed this link, which shows up on the "You haven't been mentioned yet!" screen. Right now it leads to a "no such article" page.
2019-03-04 11:12:56 -08:00
Tim Abbott
84d2be5e0c docs: Fix export/import manage.py instructions typos.
Fixes #11755.
2019-03-04 11:12:48 -08:00
Tim Abbott
d360833d7f nginx: Restructure how we manage uploaded file routes.
The overall goal of this change is to fix an issue where on Ubuntu
Trusty, we were accidentally overriding the configuration to serve
uploads from disk with the regular expressions for adding access
control headers.

However, while investigating this, it became clear that we could
considerably simplify the mental energy required to understand this
system by making the uploads-route file be unconditionally available
and included from `zulip-include/app` (which means the zulip_ops code
can share behavior here).

We also move the Access-Control-Allow-* headers to a separate include
file, to avoid duplicating it in 5 places.  Fixing this duplication
discovered a potential bug in the settings used for Tornado, where
DELETE was not allowed on a route that definitely expects DELETE.

Fixes #11758.
2019-03-04 11:12:44 -08:00
Tim Abbott
bc3db1701b realm_logo: Fix synchronization of realm night logo.
The night logo synchronization on the settings page was perfect, but
the actual display logic had a few problems:

* We were including the realm_logo in context_processors, even though
  it is only used in home.py.
* We used different variable names for the templating in navbar.html
  than anywhere else in the codebase.

* The behavior that the night logo would default to the day logo if
  only one was uploaded was not correctly implemented for the navbar
  position, either in the synchronization for updates code or the
  logic in the navbar.html templates.
2019-03-04 11:12:36 -08:00
Rishi Gupta
e8aca7b723 help: Reorganize stream-permissions table. 2019-03-04 11:12:32 -08:00
Tim Abbott
7a72390710 copy: Fix extra space before > in copy-paste styling. 2019-03-04 11:12:11 -08:00
Boris Yankov
3ffe4ca3e5 user status: Make "unavailable" status circle grey.
After discussion, we decided that the red color is too distinct
and does not convey the idea of "almost offline".

This changes the new "unavailable" status circle's color from dark
red to grey, the same color used by the "offline" status circle.
2019-03-04 11:11:52 -08:00
4368 changed files with 366572 additions and 621438 deletions


@@ -1,5 +0,0 @@
-> 0.15%
-> 0.15% in US
-last 2 versions
-Firefox ESR
-not dead

.circleci/config.yml (new file)

@@ -0,0 +1,161 @@
# See https://zulip.readthedocs.io/en/latest/testing/continuous-integration.html for
# high-level documentation on our CircleCI setup.
# See CircleCI upstream's docs on this config format:
#   https://circleci.com/docs/2.0/language-python/
#
version: 2
aliases:
  - &create_cache_directories
    run:
      name: create cache directories
      command: |
        dirs=(/srv/zulip-{npm,venv}-cache)
        sudo mkdir -p "${dirs[@]}"
        sudo chown -R circleci "${dirs[@]}"

  - &restore_cache_package_json
    restore_cache:
      keys:
        - v1-npm-base.{{ .Environment.CIRCLE_JOB }}-{{ checksum "package.json" }}-{{ checksum "yarn.lock" }}

  - &restore_cache_requirements
    restore_cache:
      keys:
        - v1-venv-base.{{ .Environment.CIRCLE_JOB }}-{{ checksum "requirements/thumbor.txt" }}-{{ checksum "requirements/dev.txt" }}

  - &install_dependencies
    run:
      name: install dependencies
      command: |
        sudo apt-get update
        # Install moreutils so we can use `ts` and `mispipe` in the following.
        sudo apt-get install -y moreutils

        # CircleCI sets the following in Git config at clone time:
        #   url.ssh://git@github.com.insteadOf https://github.com
        # This breaks the Git clones in the NVM `install.sh` we run
        # in `install-node`.
        # TODO: figure out why that breaks, and whether we want it.
        #   (Is it an optimization?)
        rm -f /home/circleci/.gitconfig

        # This is the main setup job for the test suite
        mispipe "tools/ci/setup-backend" ts

        # Cleaning caches is mostly unnecessary in Circle, because
        # most builds don't get to write to the cache.
        # mispipe "scripts/lib/clean-unused-caches --verbose --threshold 0" ts

  - &save_cache_package_json
    save_cache:
      paths:
        - /srv/zulip-npm-cache
      key: v1-npm-base.{{ .Environment.CIRCLE_JOB }}-{{ checksum "package.json" }}-{{ checksum "yarn.lock" }}

  - &save_cache_requirements
    save_cache:
      paths:
        - /srv/zulip-venv-cache
      key: v1-venv-base.{{ .Environment.CIRCLE_JOB }}-{{ checksum "requirements/thumbor.txt" }}-{{ checksum "requirements/dev.txt" }}
      # TODO: in Travis we also cache ~/zulip-emoji-cache, ~/node, ~/misc

  - &run_backend_tests
    run:
      name: run backend tests
      command: |
        . /srv/zulip-py3-venv/bin/activate
        mispipe ./tools/ci/backend ts

  - &run_frontend_tests
    run:
      name: run frontend tests
      command: |
        . /srv/zulip-py3-venv/bin/activate
        mispipe ./tools/ci/frontend ts

  - &upload_coverage_report
    run:
      name: upload coverage report
      command: |
        . /srv/zulip-py3-venv/bin/activate
        pip install codecov && codecov \
          || echo "Error in uploading coverage reports to codecov.io."

jobs:
  "trusty-python-3.4":
    docker:
      # This is built from tools/circleci/images/trusty/Dockerfile .
      - image: gregprice/circleci:trusty-python-5.test

    working_directory: ~/zulip

    steps:
      - checkout

      - *create_cache_directories
      - *restore_cache_package_json
      - *restore_cache_requirements
      - *install_dependencies
      - *save_cache_package_json
      - *save_cache_requirements
      - *run_backend_tests
      - *run_frontend_tests
      - *upload_coverage_report

      # - store_artifacts:  # TODO
      #     path: var/casper/
      #     # also /tmp/zulip-test-event-log/
      #     destination: test-reports

  "xenial-python-3.5":
    docker:
      # This is built from tools/circleci/images/xenial/Dockerfile .
      - image: gregprice/circleci:xenial-python-4.test

    working_directory: ~/zulip

    steps:
      - checkout
      - *create_cache_directories
      - *restore_cache_package_json
      - *restore_cache_requirements
      - *install_dependencies
      - *save_cache_package_json
      - *save_cache_requirements
      - *run_backend_tests
      - *upload_coverage_report

  "bionic-python-3.6":
    docker:
      # This is built from tools/circleci/images/bionic/Dockerfile .
      - image: gregprice/circleci:bionic-python-1.test

    working_directory: ~/zulip

    steps:
      - checkout
      - *create_cache_directories
      - run:
          name: do Bionic hack
          command: |
            # Temporary hack till `sudo service redis-server start` gets fixes in Bionic. See
            # https://chat.zulip.org/#narrow/stream/3-backend/topic/Ubuntu.20bionic.20CircleCI
            redis-server --daemonize yes
      - *restore_cache_package_json
      - *restore_cache_requirements
      - *install_dependencies
      - *save_cache_package_json
      - *save_cache_requirements
      - *run_backend_tests
      - *upload_coverage_report

workflows:
  version: 2
  build:
    jobs:
      - "trusty-python-3.4"
      - "xenial-python-3.5"
      - "bionic-python-3.6"


@@ -5,8 +5,6 @@ coverage:
   project:
     default:
       target: auto
-      # Codecov has the tendency to report a lot of false negatives,
-      # so we basically suppress comments completely.
-      threshold: 50%
+      threshold: 0.50
       base: auto
   patch: off


@@ -3,22 +3,23 @@ root = true
 [*]
 end_of_line = lf
 charset = utf-8
-indent_size = 4
-indent_style = space
-insert_final_newline = true
 trim_trailing_whitespace = true
-binary_next_line = true # for shfmt
-switch_case_indent = true # for shfmt
+insert_final_newline = true

-[{*.{js,json,ts},check-openapi}]
-max_line_length = 100
+[*.{sh,py,pyi,js,json,yml,xml,css,md,markdown,handlebars,html}]
+indent_style = space
+indent_size = 4

-[*.{py,pyi}]
+[*.{py}]
 max_line_length = 110

-[*.{md,svg,rb,pp,yaml,yml}]
+[*.{js}]
+max_line_length = 120
+
+[*.{svg,rb,pp,pl}]
+indent_style = space
 indent_size = 2

-[package.json]
-indent_size = 2
+[*.{cfg}]
+indent_style = space
+indent_size = 8


@@ -1,14 +1,2 @@
-# This is intended for generated files and vendored third-party files.
-# For our source code, instead of adding files here, consider using
-# specific eslint-disable comments in the files themselves.
-/docs/_build
-/static/generated
-/static/third
-/static/webpack-bundles
-/var/*
-!/var/puppeteer
-/var/puppeteer/*
-!/var/puppeteer/test_credentials.d.ts
-/zulip-current-venv
-/zulip-py3-venv
+static/js/blueslip.js
+static/webpack-bundles


@@ -1,230 +1,411 @@
{ {
"env": { "env": {
"es2020": true, "node": true,
"node": true "es6": true
}, },
"extends": [
"eslint:recommended",
"plugin:import/errors",
"plugin:import/warnings",
"plugin:unicorn/recommended",
"prettier"
],
"parser": "@babel/eslint-parser",
"parserOptions": { "parserOptions": {
"warnOnUnsupportedTypeScriptVersion": false, "sourceType": "module"
"sourceType": "unambiguous"
}, },
"reportUnusedDisableDirectives": true, "globals": {
"$": false,
"ClipboardJS": false,
"Dict": false,
"FetchStatus": false,
"Filter": false,
"Handlebars": false,
"LightboxCanvas": false,
"MessageListData": false,
"MessageListView": false,
"PerfectScrollbar": false,
"Plotly": false,
"SockJS": false,
"Socket": false,
"Sortable": false,
"WinChan": false,
"XDate": false,
"_": false,
"activity": false,
"admin": false,
"alert_words": false,
"alert_words_ui": false,
"attachments_ui": false,
"avatar": false,
"billing": false,
"blueslip": false,
"bot_data": false,
"bridge": false,
"buddy_data": false,
"buddy_list": false,
"channel": false,
"click_handlers": false,
"color_data": false,
"colorspace": false,
"common": false,
"components": false,
"compose": false,
"compose_actions": false,
"compose_fade": false,
"compose_pm_pill": false,
"compose_state": false,
"compose_ui": false,
"composebox_typeahead": false,
"condense": false,
"confirm_dialog": false,
"copy_and_paste": false,
"csrf_token": false,
"current_msg_list": true,
"drafts": false,
"echo": false,
"emoji": false,
"emoji_codes": false,
"emoji_picker": false,
"favicon": false,
"feature_flags": false,
"feedback_widget": false,
"fenced_code": false,
"flatpickr": false,
"floating_recipient_bar": false,
"gear_menu": false,
"hash_util": false,
"hashchange": false,
"helpers": false,
"home_msg_list": false,
"hotspots": false,
"i18n": false,
"info_overlay": false,
"input_pill": false,
"invite": false,
"jQuery": false,
"katex": false,
"keydown_util": false,
"lightbox": false,
"list_cursor": false,
"list_render": false,
"list_util": false,
"loading": false,
"localStorage": false,
"local_message": false,
"localstorage": false,
"markdown": false,
"marked": false,
"md5": false,
"message_edit": false,
"message_events": false,
"message_fetch": false,
"message_flags": false,
"message_list": false,
"message_live_update": false,
"message_scroll": false,
"message_store": false,
"message_util": false,
"message_viewport": false,
"moment": false,
"muting": false,
"muting_ui": false,
"narrow": false,
"narrow_state": false,
"navigate": false,
"night_mode": false,
"notifications": false,
"overlays": false,
"padded_widget": false,
"page_params": false,
"panels": false,
"people": false,
"pm_conversations": false,
"pm_list": false,
"pointer": false,
"popovers": false,
"presence": false,
"pygments_data": false,
"reactions": false,
"realm_icon": false,
"realm_logo": false,
"realm_night_logo": false,
"recent_senders": false,
"reload": false,
"reload_state": false,
"reminder": false,
"resize": false,
"rows": false,
"rtl": false,
"run_test": false,
"schema": false,
"scroll_bar": false,
"scroll_util": false,
"search": false,
"search_pill": false,
"search_pill_widget": false,
"search_suggestion": false,
"search_util": false,
"sent_messages": false,
"server_events": false,
"server_events_dispatch": false,
"settings": false,
"settings_account": false,
"settings_bots": false,
"settings_display": false,
"settings_emoji": false,
"settings_linkifiers": false,
"settings_invites": false,
"settings_muting": false,
"settings_notifications": false,
"settings_org": false,
"settings_panel_menu": false,
"settings_profile_fields": false,
"settings_sections": false,
"settings_streams": false,
"settings_toggle": false,
"settings_ui": false,
"settings_user_groups": false,
"settings_users": false,
"starred_messages": false,
"stream_color": false,
"stream_create": false,
"stream_data": false,
"stream_edit": false,
"stream_events": false,
"stream_list": false,
"stream_muting": false,
"stream_popover": false,
"stream_sort": false,
"StripeCheckout": false,
"submessage": false,
"subs": false,
"tab_bar": false,
"templates": false,
"tictactoe_widget": false,
"timerender": false,
"toMarkdown": false,
"todo_widget": false,
"top_left_corner": false,
"topic_data": false,
"topic_generator": false,
"topic_list": false,
"topic_zoom": false,
"transmit": false,
"tutorial": false,
"typeahead_helper": false,
"typing": false,
"typing_data": false,
"typing_events": false,
"typing_status": false,
"ui": false,
"ui_init": false,
"ui_report": false,
"ui_util": false,
"unread": false,
"unread_ops": false,
"unread_ui": false,
"upgrade": false,
"upload": false,
"upload_widget": false,
"user_events": false,
"user_groups": false,
"user_pill": false,
"user_search": false,
"user_status": false,
"user_status_ui": false,
"util": false,
"poll_widget": false,
"widgetize": false,
"zcommand": false,
"zform": false,
"zxcvbn": false
},
"plugins": [
"eslint-plugin-empty-returns"
],
"rules": { "rules": {
"array-callback-return": "error", "array-callback-return": "error",
"arrow-body-style": "error", "array-bracket-spacing": "error",
"block-scoped-var": "error", "arrow-spacing": [ "error", { "before": true, "after": true } ],
"consistent-return": "error", "block-scoped-var": 2,
"curly": "error", "brace-style": [ "error", "1tbs", { "allowSingleLine": true } ],
"dot-notation": "error", "camelcase": 0,
"eqeqeq": "error", "comma-dangle": [ "error",
"guard-for-in": "error",
"import/extensions": "error",
"import/first": "error",
"import/newline-after-import": "error",
"import/no-useless-path-segments": "error",
"import/order": [
"error",
{ {
"alphabetize": {"order": "asc"}, "arrays": "always-multiline",
"newlines-between": "always" "objects": "always-multiline",
"imports": "always-multiline",
"exports": "always-multiline",
"functions": "never"
} }
], ],
"import/unambiguous": "error", "comma-spacing": [ "error",
"lines-around-directive": "error",
"new-cap": "error",
"no-alert": "error",
"no-array-constructor": "error",
"no-bitwise": "error",
"no-caller": "error",
"no-catch-shadow": "error",
"no-constant-condition": ["error", {"checkLoops": false}],
"no-div-regex": "error",
"no-duplicate-imports": "error",
"no-else-return": "error",
"no-eq-null": "error",
"no-eval": "error",
"no-implicit-coercion": "error",
"no-implied-eval": "error",
"no-inner-declarations": "off",
"no-iterator": "error",
"no-label-var": "error",
"no-labels": "error",
"no-loop-func": "error",
"no-multi-str": "error",
"no-native-reassign": "error",
"no-new-func": "error",
"no-new-object": "error",
"no-new-wrappers": "error",
"no-octal-escape": "error",
"no-plusplus": "error",
"no-proto": "error",
"no-return-assign": "error",
"no-script-url": "error",
"no-self-compare": "error",
"no-sync": "error",
"no-throw-literal": "error",
"no-undef-init": "error",
"no-unneeded-ternary": ["error", {"defaultAssignment": false}],
"no-unused-expressions": "error",
"no-use-before-define": ["error", {"functions": false}],
"no-useless-concat": "error",
"no-useless-constructor": "error",
"no-var": "error",
"object-shorthand": "error",
"one-var": ["error", "never"],
"prefer-arrow-callback": "error",
"prefer-const": [
"error",
{ {
"before": false,
"after": true
}
],
"complexity": [ 0, 4 ],
"curly": 2,
"dot-notation": [ "error", { "allowKeywords": true } ],
"empty-returns/main": "error",
"eol-last": [ "error", "always" ],
"eqeqeq": 2,
"func-style": [ "off", "expression" ],
"guard-for-in": 2,
"indent": ["error", 4, {
"ArrayExpression": "first",
"outerIIFEBody": 0,
"ObjectExpression": "first",
"SwitchCase": 0,
"CallExpression": {"arguments": "first"},
"FunctionExpression": {"parameters": "first"},
"FunctionDeclaration": {"parameters": "first"}
}],
"key-spacing": [ "error",
{
"beforeColon": false,
"afterColon": true
}
],
"keyword-spacing": [ "error",
{
"before": true,
"after": true,
"overrides": {
"return": { "after": true },
"throw": { "after": true },
"case": { "after": true }
}
}
],
"max-depth": [ 0, 4 ],
"max-len": [ "error", 100, 2,
{
"ignoreUrls": true,
"ignoreComments": false,
"ignoreRegExpLiterals": true,
"ignoreStrings": true,
"ignoreTemplateLiterals": true
}
],
"max-params": [ 0, 3 ],
"max-statements": [ 0, 10 ],
"new-cap": [ "error",
{
"newIsCap": true,
"capIsNew": false
}
],
"new-parens": 2,
"newline-per-chained-call": 0,
"no-alert": 2,
"no-array-constructor": "error",
"no-bitwise": 2,
"no-caller": 2,
"no-case-declarations": "error",
"no-catch-shadow": 2,
"no-console": 0,
"no-const-assign": "error",
"no-control-regex": 2,
"no-debugger": 2,
"no-delete-var": 2,
"no-div-regex": 2,
"no-dupe-class-members": "error",
"no-dupe-keys": 2,
"no-duplicate-imports": "error",
"no-else-return": 2,
"no-empty": 2,
"no-empty-character-class": 2,
"no-eq-null": 2,
"no-eval": 2,
"no-ex-assign": 2,
"no-extra-parens": ["error", "all"],
"no-extra-semi": 2,
"no-fallthrough": 2,
"no-floating-decimal": 2,
"no-func-assign": 2,
"no-implied-eval": 2,
"no-iterator": "error",
"no-label-var": 2,
"no-labels": 2,
"no-loop-func": 2,
"no-mixed-requires": [ 0, false ],
"no-multi-str": 2,
"no-native-reassign": 2,
"no-nested-ternary": 0,
"no-new-func": "error",
"no-new-object": 2,
"no-new-wrappers": 2,
"no-obj-calls": 2,
"no-octal": 2,
"no-octal-escape": 2,
"no-param-reassign": 0,
"no-plusplus": 2,
"no-proto": 2,
"no-redeclare": 2,
"no-regex-spaces": 2,
"no-restricted-syntax": 0,
"no-return-assign": 2,
"no-script-url": 2,
"no-self-compare": 2,
"no-shadow": 0,
"no-sync": 2,
"no-ternary": 0,
"no-trailing-spaces": 2,
"no-undef": "error",
"no-undef-init": 2,
"no-underscore-dangle": 0,
"no-unneeded-ternary": [ "error", { "defaultAssignment": false } ],
"no-unreachable": 2,
"no-unused-expressions": 2,
"no-unused-vars": [ "error",
{
"vars": "local",
"args": "after-used",
"varsIgnorePattern": "print_elapsed_time|check_duplicate_ids"
}
],
"no-use-before-define": 2,
"no-useless-constructor": "error",
// The Zulip codebase complies partially with the "no-useless-escape"
// rule; only regex expressions haven't been updated yet.
// Updated regex expressions are currently being tested in casper
// files and will decide about a potential future enforcement of this rule.
"no-useless-escape": 0,
"space-unary-ops": 2,
"no-whitespace-before-property": 2,
"no-with": 2,
"one-var": [ "error", "never" ],
"padded-blocks": 0,
"prefer-const": [ "error",
{
"destructuring": "any",
"ignoreReadBeforeAssign": true "ignoreReadBeforeAssign": true
} }
], ],
"radix": "error", "quote-props": [ "error", "as-needed",
"sort-imports": ["error", {"ignoreDeclarationSort": true}],
"spaced-comment": ["error", "always", {"markers": ["/"]}],
"strict": "error",
"unicorn/consistent-function-scoping": "off",
"unicorn/explicit-length-check": "off",
"unicorn/filename-case": "off",
"unicorn/no-nested-ternary": "off",
"unicorn/no-null": "off",
"unicorn/no-process-exit": "off",
"unicorn/no-useless-undefined": "off",
"unicorn/number-literal-case": "off",
"unicorn/numeric-separators-style": "off",
"unicorn/prefer-module": "off",
"unicorn/prefer-node-protocol": "off",
"unicorn/prefer-spread": "off",
"unicorn/prefer-ternary": "off",
"unicorn/prevent-abbreviations": "off",
"valid-typeof": ["error", {"requireStringLiterals": true}],
"yoda": "error"
},
"overrides": [
{ {
"files": ["frontend_tests/puppeteer_lib/**", "frontend_tests/puppeteer_tests/**"], "keywords": false,
"globals": { "unnecessary": true,
"$": false, "numbers": false
"zulip_test": false
} }
},
{
"files": ["static/js/**"],
"globals": {
"StripeCheckout": false
}
},
{
"files": ["**/*.ts"],
"extends": ["plugin:@typescript-eslint/recommended", "plugin:import/typescript"],
"parserOptions": {
"project": "tsconfig.json"
},
"rules": {
// Disable base rule to avoid conflict
"no-duplicate-imports": "off",
"no-unused-vars": "off",
"no-useless-constructor": "off",
"@typescript-eslint/array-type": "error",
"@typescript-eslint/await-thenable": "error",
"@typescript-eslint/consistent-type-assertions": "error",
"@typescript-eslint/consistent-type-imports": "error",
"@typescript-eslint/explicit-function-return-type": [
"error",
{"allowExpressions": true}
], ],
"@typescript-eslint/member-ordering": "error", "quotes": [ 0, "single" ],
"@typescript-eslint/no-duplicate-imports": "off", "radix": 2,
"@typescript-eslint/no-explicit-any": "off", "semi": 2,
"@typescript-eslint/no-extraneous-class": "error", "semi-spacing": [2, {"before": false, "after": true}],
"@typescript-eslint/no-non-null-assertion": "off", "space-before-blocks": 2,
"@typescript-eslint/no-parameter-properties": "error", "space-before-function-paren": [ "error",
"@typescript-eslint/no-unnecessary-qualifier": "error",
"@typescript-eslint/no-unnecessary-type-assertion": "error",
"@typescript-eslint/no-unused-vars": ["error", {"varsIgnorePattern": "^_"}],
"@typescript-eslint/no-use-before-define": "error",
"@typescript-eslint/no-useless-constructor": "error",
"@typescript-eslint/prefer-includes": "error",
"@typescript-eslint/prefer-regexp-exec": "error",
"@typescript-eslint/prefer-string-starts-ends-with": "error",
"@typescript-eslint/promise-function-async": "error",
"@typescript-eslint/unified-signatures": "error",
"no-undef": "error"
}
},
{
"files": ["**/*.d.ts"],
"rules": {
"import/unambiguous": "off"
}
},
{
"files": ["frontend_tests/**"],
"globals": {
"CSS": false,
"document": false,
"navigator": false,
"window": false
},
"rules": {
"no-sync": "off"
}
},
{
"files": ["tools/debug-require.js"],
"env": {
"browser": true,
"es2020": false
},
"rules": {
// Don't require ES features that PhantomJS doesn't support
// TODO: Toggle these settings now that we don't use PhantomJS
"no-var": "off",
"object-shorthand": "off",
"prefer-arrow-callback": "off"
}
},
{
"files": ["static/**"],
"env": {
"browser": true,
"node": false
},
"rules": {
"no-console": "error"
},
"settings": {
"import/resolver": "webpack"
}
},
{
"files": ["static/shared/**"],
"env": {
"browser": false,
"shared-node-browser": true
},
"rules": {
"import/no-restricted-paths": [
"error",
{
"zones": [
{
"target": "./static/shared",
"from": ".",
"except": ["./node_modules", "./static/shared"]
}
]
}
]
}
}
]
}

.gitattributes

@@ -11,3 +11,4 @@
 *.otf binary
 *.tif binary
 *.ogg binary
+yarn.lock binary

.github/FUNDING.yml

@@ -1,3 +0,0 @@
github: zulip
patreon: zulip
open_collective: zulip


@@ -1,11 +1,14 @@
 <!-- What's this PR for? (Just a link to an issue is fine.) -->
-**Testing plan:** <!-- How have you tested? -->
-**GIFs or screenshots:** <!-- If a UI change. See:
+**Testing Plan:** <!-- How have you tested? -->
+**GIFs or Screenshots:** <!-- If a UI change. See:
 https://zulip.readthedocs.io/en/latest/tutorials/screenshot-and-gif-software.html
 -->
 <!-- Also be sure to make clear, coherent commits:
 https://zulip.readthedocs.io/en/latest/contributing/version-control.html
 -->


@@ -1,43 +0,0 @@
name: Cancel previous runs
on: [push, pull_request]
defaults:
run:
shell: bash
jobs:
cancel:
name: Cancel previous runs
runs-on: ubuntu-latest
timeout-minutes: 3
# Don't run this job for zulip/zulip pushes since we
# want to run those jobs.
if: ${{ github.event_name != 'push' || github.event.repository.full_name != 'zulip/zulip' }}
steps:
# We get workflow IDs from GitHub API so we don't have to maintain
# a hard-coded list of IDs which need to be updated when a workflow
# is added or removed. And, workflow IDs are different for other forks
# so this is required.
- name: Get workflow IDs.
id: workflow_ids
continue-on-error: true # Don't fail this job on failure
env:
# This is in <owner>/<repo> format e.g. zulip/zulip
REPOSITORY: ${{ github.repository }}
run: |
workflow_api_url=https://api.github.com/repos/$REPOSITORY/actions/workflows
curl $workflow_api_url -o workflows.json
script="const {workflows} = require('./workflows'); \
const ids = workflows.map(workflow => workflow.id); \
console.log(ids.join(','));"
ids=$(node -e "$script")
echo "::set-output name=ids::$ids"
- uses: styfle/cancel-workflow-action@0.9.0
continue-on-error: true # Don't fail this job on failure
with:
workflow_id: ${{ steps.workflow_ids.outputs.ids }}
access_token: ${{ github.token }}


@@ -1,31 +0,0 @@
name: "Code scanning"
on: [push, pull_request]
jobs:
CodeQL:
if: ${{!github.event.repository.private}}
runs-on: ubuntu-latest
steps:
- name: Check out repository
uses: actions/checkout@v2
with:
# We must fetch at least the immediate parents so that if this is
# a pull request then we can check out the head.
fetch-depth: 2
# If this run was triggered by a pull request event, then check out
# the head of the pull request instead of the merge commit.
- run: git checkout HEAD^2
if: ${{ github.event_name == 'pull_request' }}
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v1
# Override language selection by uncommenting this and choosing your languages
# with:
# languages: go, javascript, csharp, python, cpp, java
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v1


@@ -1,275 +0,0 @@
name: Zulip production suite
on:
push:
paths:
- "**/migrations/**"
- puppet/**
- requirements/**
- scripts/**
- static/**
- tools/**
- zproject/**
- yarn.lock
- .github/workflows/production-suite.yml
pull_request:
paths:
- "**/migrations/**"
- puppet/**
- requirements/**
- scripts/**
- static/**
- tools/**
- zproject/**
- yarn.lock
- .github/workflows/production-suite.yml
defaults:
run:
shell: bash
jobs:
production_build:
# This job builds a release tarball from the current commit, which
# will be used for all of the following install/upgrade tests.
name: Bionic production build
runs-on: ubuntu-latest
# This docker image was created by a generated Dockerfile at:
# tools/ci/images/bionic/Dockerfile
# Bionic ships with Python 3.6.
container: zulip/ci:bionic
steps:
- name: Add required permissions
run: |
# The checkout action doesn't clone to ~/zulip or allow
# us to use the path option to clone outside the current
# /__w/zulip/zulip directory. Since this directory is owned
# by root we need to change its ownership to allow the
# github user to clone the code here.
# Note: /__w/ is a docker volume mounted to $GITHUB_WORKSPACE
# which is /home/runner/work/.
sudo chown -R github .
# This is the GitHub Actions specific cache directory that
# the current github user must be able to access for the
# cache action to work. It is owned by root currently.
sudo chmod -R 0777 /__w/_temp/
- uses: actions/checkout@v2
- name: Create cache directories
run: |
dirs=(/srv/zulip-{npm,venv,emoji}-cache)
sudo mkdir -p "${dirs[@]}"
sudo chown -R github "${dirs[@]}"
- name: Restore node_modules cache
uses: actions/cache@v2
with:
path: /srv/zulip-npm-cache
key: v1-yarn-deps-bionic-${{ hashFiles('package.json') }}-${{ hashFiles('yarn.lock') }}
restore-keys: v1-yarn-deps-bionic
- name: Restore python cache
uses: actions/cache@v2
with:
path: /srv/zulip-venv-cache
key: v1-venv-bionic-${{ hashFiles('requirements/dev.txt') }}
restore-keys: v1-venv-bionic
- name: Restore emoji cache
uses: actions/cache@v2
with:
path: /srv/zulip-emoji-cache
key: v1-emoji-bionic-${{ hashFiles('tools/setup/emoji/emoji_map.json') }}-${{ hashFiles('tools/setup/emoji/build_emoji') }}-${{ hashFiles('tools/setup/emoji/emoji_setup_utils.py') }}-${{ hashFiles('tools/setup/emoji/emoji_names.py') }}-${{ hashFiles('package.json') }}
restore-keys: v1-emoji-bionic
- name: Do Bionic hack
run: |
# Temporary hack till `sudo service redis-server start` gets fixed in Bionic. See
# https://chat.zulip.org/#narrow/stream/3-backend/topic/Ubuntu.20bionic.20CircleCI
sudo sed -i '/^bind/s/bind.*/bind 0.0.0.0/' /etc/redis/redis.conf
- name: Build production tarball
run: ./tools/ci/production-build
- name: Upload production build artifacts for install jobs
uses: actions/upload-artifact@v2
with:
name: production-tarball
path: /tmp/production-build
retention-days: 14
- name: Report status
if: failure()
env:
ZULIP_BOT_KEY: ${{ secrets.ZULIP_BOT_KEY }}
run: tools/ci/send-failure-message
production_install:
# This job installs the server release tarball built above on a
# range of platforms, and does some basic health checks on the
# resulting installer Zulip server.
strategy:
fail-fast: false
matrix:
include:
# Base images are built using `tools/ci/Dockerfile.template`.
# The comments at the top explain how to build and upload these images.
- docker_image: zulip/ci:bionic
name: Bionic production install
is_bionic: true
os: bionic
- docker_image: zulip/ci:focal
name: Focal production install
is_focal: true
os: focal
- docker_image: zulip/ci:buster
name: Buster production install
is_buster: true
os: buster
- docker_image: zulip/ci:bullseye
name: Bullseye production install
is_bullseye: true
os: bullseye
name: ${{ matrix.name }}
container:
image: ${{ matrix.docker_image }}
options: --init
runs-on: ubuntu-latest
needs: production_build
steps:
- name: Download built production tarball
uses: actions/download-artifact@v2
with:
name: production-tarball
path: /tmp
- name: Add required permissions and setup
run: |
# This is the GitHub Actions specific cache directory that
# the current github user must be able to access for the
# cache action to work. It is owned by root currently.
sudo chmod -R 0777 /__w/_temp/
# Create the zulip directory that the tools/ci/ scripts need
mkdir -p /home/github/zulip
# Since actions/download-artifact@v2 loses all the permissions
# of the tarball uploaded by upload-artifact, fix those.
chmod +x /tmp/production-extract-tarball
chmod +x /tmp/production-upgrade-pg
chmod +x /tmp/production-install
chmod +x /tmp/production-verify
chmod +x /tmp/send-failure-message
- name: Create cache directories
run: |
dirs=(/srv/zulip-{npm,venv,emoji}-cache)
sudo mkdir -p "${dirs[@]}"
sudo chown -R github "${dirs[@]}"
- name: Restore node_modules cache
uses: actions/cache@v2
with:
path: /srv/zulip-npm-cache
key: v1-yarn-deps-${{ matrix.os }}-${{ hashFiles('/tmp/package.json') }}-${{ hashFiles('/tmp/yarn.lock') }}
restore-keys: v1-yarn-deps-${{ matrix.os }}
- name: Do Bionic hack
if: ${{ matrix.is_bionic }}
run: |
# Temporary hack till `sudo service redis-server start` gets fixed in Bionic. See
# https://chat.zulip.org/#narrow/stream/3-backend/topic/Ubuntu.20bionic.20CircleCI
sudo sed -i '/^bind/s/bind.*/bind 0.0.0.0/' /etc/redis/redis.conf
- name: Production extract tarball
run: /tmp/production-extract-tarball
- name: Install production
run: |
sudo service rabbitmq-server restart
sudo /tmp/production-install
- name: Verify install
run: sudo /tmp/production-verify
- name: Upgrade postgresql
if: ${{ matrix.is_bionic }}
run: sudo /tmp/production-upgrade-pg
- name: Verify install after upgrading postgresql
if: ${{ matrix.is_bionic }}
run: sudo /tmp/production-verify
- name: Report status
if: failure()
env:
ZULIP_BOT_KEY: ${{ secrets.ZULIP_BOT_KEY }}
run: /tmp/send-failure-message
production_upgrade:
# The production upgrade job starts with a container with a
# previous Zulip release installed, and attempts to upgrade it to
# the release tarball built for the current commit being tested.
#
# This is intended to catch bugs that result in the upgrade
# process failing.
strategy:
fail-fast: false
matrix:
include:
# Base images are built using `tools/ci/Dockerfile.prod.template`.
# The comments at the top explain how to build and upload these images.
- docker_image: zulip/ci:buster-3.4
name: 3.4 Version Upgrade
is_focal: true
os: buster
name: ${{ matrix.name }}
container:
image: ${{ matrix.docker_image }}
options: --init
runs-on: ubuntu-latest
needs: production_build
steps:
- name: Download built production tarball
uses: actions/download-artifact@v2
with:
name: production-tarball
path: /tmp
- name: Add required permissions and setup
run: |
# This is the GitHub Actions specific cache directory that
# the current github user must be able to access for the
# cache action to work. It is owned by root currently.
sudo chmod -R 0777 /__w/_temp/
# Since actions/download-artifact@v2 loses all the permissions
# of the tarball uploaded by upload-artifact, fix those.
chmod +x /tmp/production-upgrade
chmod +x /tmp/production-verify
chmod +x /tmp/send-failure-message
- name: Upgrade production
run: sudo /tmp/production-upgrade
# TODO: We should be running production-verify here, but it
# doesn't pass yet.
#
# - name: Verify install
# run: sudo /tmp/production-verify
- name: Report status
if: failure()
env:
ZULIP_BOT_KEY: ${{ secrets.ZULIP_BOT_KEY }}
run: /tmp/send-failure-message


@@ -1,24 +0,0 @@
name: Update one click apps
on:
release:
types: [published]
jobs:
update-digitalocean-oneclick-app:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Update DigitalOcean one click app
env:
DIGITALOCEAN_API_KEY: ${{ secrets.ONE_CLICK_ACTION_DIGITALOCEAN_API_KEY }}
ZULIP_API_KEY: ${{ secrets.ONE_CLICK_ACTION_ZULIP_BOT_API_KEY }}
ZULIP_EMAIL: ${{ secrets.ONE_CLICK_ACTION_ZULIP_BOT_EMAIL }}
ZULIP_SITE: https://chat.zulip.org
ONE_CLICK_ACTION_STREAM: kandra ops
PYTHON_DIGITALOCEAN_REQUEST_TIMEOUT_SEC: 30
RELEASE_VERSION: ${{ github.event.release.tag_name }}
run: |
export PATH="$HOME/.local/bin:$PATH"
git clone https://github.com/zulip/marketplace-partners
pip3 install python-digitalocean zulip fab-classic
echo $PATH
python3 tools/oneclickapps/prepare_digital_ocean_one_click_app_release.py


@@ -1,250 +0,0 @@
# NOTE: Every test in this file should be in `tools/test-all`. If there's a
# reason not to run it there, it should be there as a comment
# explaining why.
name: Zulip CI
on: [push, pull_request]
defaults:
run:
shell: bash
jobs:
tests:
strategy:
fail-fast: false
matrix:
include:
# This docker image was created by a generated Dockerfile at:
# tools/ci/images/bionic/Dockerfile
# Bionic ships with Python 3.6.
- docker_image: zulip/ci:bionic
name: Ubuntu 18.04 Bionic (Python 3.6, backend + frontend)
os: bionic
is_bionic: true
include_frontend_tests: true
# This docker image was created by a generated Dockerfile at:
# tools/ci/images/focal/Dockerfile
# Focal ships with Python 3.8.2.
- docker_image: zulip/ci:focal
name: Ubuntu 20.04 Focal (Python 3.8, backend)
os: focal
is_focal: true
include_frontend_tests: false
# This docker image was created by a generated Dockerfile at:
# tools/ci/images/focal/Dockerfile
# Bullseye ships with Python 3.9.2.
- docker_image: zulip/ci:bullseye
name: Debian 11 Bullseye (Python 3.9, backend)
os: bullseye
is_bullseye: true
include_frontend_tests: false
runs-on: ubuntu-latest
name: ${{ matrix.name }}
container: ${{ matrix.docker_image }}
env:
# GitHub Actions sets HOME to /github/home which causes
# problems later in provision and the frontend tests that run
# tools/setup/postgresql-init-dev-db because of the .pgpass
# location. PostgreSQL (psql) expects .pgpass to be at
# /home/github/.pgpass and setting home to `/home/github/`
# ensures it is written there because we write it to ~/.pgpass.
HOME: /home/github/
steps:
- name: Add required permissions
run: |
# The checkout action doesn't clone to ~/zulip or allow
# us to use the path option to clone outside the current
# /__w/zulip/zulip directory. Since this directory is owned
# by root we need to change its ownership to allow the
# github user to clone the code here.
# Note: /__w/ is a docker volume mounted to $GITHUB_WORKSPACE
# which is /home/runner/work/.
sudo chown -R github .
# This is the GitHub Actions specific cache directory that
# the current github user must be able to access for the
# cache action to work. It is owned by root currently.
sudo chmod -R 0777 /__w/_temp/
- uses: actions/checkout@v2
- name: Create cache directories
run: |
dirs=(/srv/zulip-{npm,venv,emoji}-cache)
sudo mkdir -p "${dirs[@]}"
sudo chown -R github "${dirs[@]}"
- name: Restore node_modules cache
uses: actions/cache@v2
with:
path: /srv/zulip-npm-cache
key: v1-yarn-deps-${{ matrix.os }}-${{ hashFiles('package.json') }}-${{ hashFiles('yarn.lock') }}
restore-keys: v1-yarn-deps-${{ matrix.os }}
- name: Restore python cache
uses: actions/cache@v2
with:
path: /srv/zulip-venv-cache
key: v1-venv-${{ matrix.os }}-${{ hashFiles('requirements/dev.txt') }}
restore-keys: v1-venv-${{ matrix.os }}
- name: Restore emoji cache
uses: actions/cache@v2
with:
path: /srv/zulip-emoji-cache
key: v1-emoji-${{ matrix.os }}-${{ hashFiles('tools/setup/emoji/emoji_map.json') }}-${{ hashFiles('tools/setup/emoji/build_emoji') }}-${{ hashFiles('tools/setup/emoji/emoji_setup_utils.py') }}-${{ hashFiles('tools/setup/emoji/emoji_names.py') }}-${{ hashFiles('package.json') }}
restore-keys: v1-emoji-${{ matrix.os }}
- name: Do Bionic hack
if: ${{ matrix.is_bionic }}
run: |
# Temporary hack till `sudo service redis-server start` gets fixed in Bionic. See
# https://chat.zulip.org/#narrow/stream/3-backend/topic/Ubuntu.20bionic.20CircleCI
sudo sed -i '/^bind/s/bind.*/bind 0.0.0.0/' /etc/redis/redis.conf
- name: Install dependencies
run: |
# This is the main setup job for the test suite
./tools/ci/setup-backend --skip-dev-db-build
# Cleaning caches is mostly unnecessary in GitHub Actions, because
# most builds don't get to write to the cache.
# scripts/lib/clean-unused-caches --verbose --threshold 0
- name: Run tools test
run: |
source tools/ci/activate-venv
./tools/test-tools
- name: Run backend lint
run: |
source tools/ci/activate-venv
echo "Test suite is running under $(python --version)."
./tools/lint --groups=backend --skip=gitlint,mypy # gitlint disabled because flaky
- name: Run frontend lint
if: ${{ matrix.include_frontend_tests }}
run: |
source tools/ci/activate-venv
./tools/lint --groups=frontend --skip=gitlint # gitlint disabled because flaky
- name: Run backend tests
run: |
source tools/ci/activate-venv
./tools/test-backend --coverage --include-webhooks --no-cov-cleanup --ban-console-output
- name: Run mypy
run: |
source tools/ci/activate-venv
# We run mypy after the backend tests so we get output from the
# backend tests, which tend to uncover more serious problems, first.
./tools/run-mypy --version
./tools/run-mypy
- name: Run miscellaneous tests
run: |
source tools/ci/activate-venv
# Currently our compiled requirements files will differ for different python versions
# so we will run test-locked-requirements only for Bionic.
# ./tools/test-locked-requirements
# ./tools/test-run-dev # https://github.com/zulip/zulip/pull/14233
#
# This test has been persistently flaky at like 1% frequency, is slow,
# and is for a very specific single feature, so we don't run it by default:
# ./tools/test-queue-worker-reload
./tools/test-migrations
./tools/setup/optimize-svg --check
./tools/setup/generate_integration_bots_avatars.py --check-missing
- name: Run documentation and api tests
run: |
source tools/ci/activate-venv
# In CI, we only test links we control in test-documentation to avoid flakes
./tools/test-documentation --skip-external-links
./tools/test-help-documentation --skip-external-links
./tools/test-api
- name: Run node tests
if: ${{ matrix.include_frontend_tests }}
run: |
source tools/ci/activate-venv
# Run the node tests first, since they're fast and deterministic
./tools/test-js-with-node --coverage
- name: Check schemas
if: ${{ matrix.include_frontend_tests }}
run: |
source tools/ci/activate-venv
# Check that various schemas are consistent. (is fast)
./tools/check-schemas
- name: Check capitalization of strings
if: ${{ matrix.include_frontend_tests }}
run: |
source tools/ci/activate-venv
./manage.py makemessages --locale en
PYTHONWARNINGS=ignore ./tools/check-capitalization --no-generate
PYTHONWARNINGS=ignore ./tools/check-frontend-i18n --no-generate
- name: Run puppeteer tests
if: ${{ matrix.include_frontend_tests }}
run: |
source tools/ci/activate-venv
./tools/test-js-with-puppeteer
- name: Check for untracked files
run: |
source tools/ci/activate-venv
# This final check looks for untracked files that may have been
# created by test-backend or provision.
untracked="$(git ls-files --exclude-standard --others)"
if [ -n "$untracked" ]; then
printf >&2 "Error: untracked files:\n%s\n" "$untracked"
exit 1
fi
- name: Test locked requirements
if: ${{ matrix.is_bionic }}
run: |
. /srv/zulip-py3-venv/bin/activate && \
./tools/test-locked-requirements
- name: Upload coverage reports
# Only upload coverage when both frontend and backend
# tests are run.
if: ${{ matrix.include_frontend_tests }}
run: |
# Codecov requires the `.coverage` file to be stored in the
# current working directory.
mv ./var/.coverage ./.coverage
. /srv/zulip-py3-venv/bin/activate || true
pip install codecov && codecov || echo "Error in uploading coverage reports to codecov.io."
- name: Store Puppeteer artifacts
# Upload these on failure, as well
if: ${{ always() && matrix.include_frontend_tests }}
uses: actions/upload-artifact@v2
with:
name: puppeteer
path: ./var/puppeteer
retention-days: 60
- name: Check development database build
if: ${{ matrix.is_focal || matrix.is_bullseye }}
run: ./tools/ci/setup-backend
- name: Report status
if: failure()
env:
ZULIP_BOT_KEY: ${{ secrets.ZULIP_BOT_KEY }}
run: tools/ci/send-failure-message

.gitignore

@@ -27,21 +27,12 @@
 package-lock.json
 /.vagrant
-/var/*
-!/var/puppeteer
-/var/puppeteer/*
-!/var/puppeteer/test_credentials.d.ts
+/var
 /.dmypy.json
-# Dockerfiles generated for continuous integration
-/tools/ci/images
-# Generated i18n data
-/locale/en
-/locale/language_options.json
-/locale/language_name_map.json
-/locale/*/mobile.json
+# Dockerfiles generated for CircleCI
+/tools/circleci/images
 # Static build
 *.mo
@@ -51,7 +42,6 @@ npm-debug.log
 /staticfiles.json
 /webpack-stats-production.json
 /yarn-error.log
-zulip-git-version
 # Test / analysis tools
 .coverage
@@ -75,13 +65,11 @@ zulip.kdev4
 *.sublime-workspace
 .vscode/
 *.DS_Store
-# .cache/ is generated by Visual Studio Code's test runner
+# .cache/ is generated by VSCode's test runner
 .cache/
 .eslintcache
-# Core dump files
-core
 ## Miscellaneous
 # (Ideally this section is empty.)
+zthumbor/thumbor_local_settings.py
 .transifexrc


@@ -1,9 +1,9 @@
 [general]
-ignore=title-trailing-punctuation, body-min-length, body-is-missing
+ignore=title-trailing-punctuation, body-min-length, body-is-missing, title-imperative-mood
 extra-path=tools/lib/gitlint-rules.py
-[title-match-regex]
+[title-match-regex-allow-exception]
 regex=^(.+:\ )?[A-Z].+\.$
 [title-max-length]

.isort.cfg

@@ -0,0 +1,10 @@
[settings]
line_length = 79
multi_line_output = 2
balanced_wrapping = true
known_third_party = django, ujson, sqlalchemy
known_first_party = zerver, zproject, version, confirmation, zilencer, analytics, frontend_tests, scripts, corporate
sections = FUTURE, STDLIB, THIRDPARTY, FIRSTPARTY, LOCALFOLDER
lines_after_imports = 1
# See the comment related to ioloop_logging for why this is skipped.
skip = zerver/management/commands/runtornado.py


@@ -1,40 +0,0 @@
Alex Vandiver <alexmv@zulip.com> <alex@chmrr.net>
Alex Vandiver <alexmv@zulip.com> <github@chmrr.net>
Allen Rabinovich <allenrabinovich@yahoo.com> <allenr@humbughq.com>
Allen Rabinovich <allenrabinovich@yahoo.com> <allenr@zulip.com>
Aman Agrawal <amanagr@zulip.com> <f2016561@pilani.bits-pilani.ac.in>
Anders Kaseorg <anders@zulip.com> <anders@zulipchat.com>
Anders Kaseorg <anders@zulip.com> <andersk@mit.edu>
Brock Whittaker <brock@zulipchat.com> <bjwhitta@asu.edu>
Brock Whittaker <brock@zulipchat.com> <brockwhittaker@Brocks-MacBook.local>
Brock Whittaker <brock@zulipchat.com> <brock@zulipchat.org>
Chris Bobbe <cbobbe@zulip.com> <cbobbe@zulipchat.com>
Chris Bobbe <cbobbe@zulip.com> <csbobbe@gmail.com>
Greg Price <greg@zulip.com> <gnprice@gmail.com>
Greg Price <greg@zulip.com> <greg@zulipchat.com>
Greg Price <greg@zulip.com> <price@mit.edu>
Jeff Arnold <jbarnold@gmail.com> <jbarnold@humbughq.com>
Jeff Arnold <jbarnold@gmail.com> <jbarnold@zulip.com>
Jessica McKellar <jesstess@mit.edu> <jesstess@humbughq.com>
Jessica McKellar <jesstess@mit.edu> <jesstess@zulip.com>
Kevin Mehall <km@kevinmehall.net> <kevin@humbughq.com>
Kevin Mehall <km@kevinmehall.net> <kevin@zulip.com>
Ray Kraesig <rkraesig@zulip.com> <rkraesig@zulipchat.com>
Rishi Gupta <rishig@zulipchat.com> <rishig+git@mit.edu>
Rishi Gupta <rishig@zulipchat.com> <rishig@kandralabs.com>
Rishi Gupta <rishig@zulipchat.com> <rishig@users.noreply.github.com>
Reid Barton <rwbarton@gmail.com> <rwbarton@humbughq.com>
Scott Feeney <scott@oceanbase.org> <scott@humbughq.com>
Scott Feeney <scott@oceanbase.org> <scott@zulip.com>
Steve Howell <showell@zulip.com> <showell30@yahoo.com>
Steve Howell <showell@zulip.com> <showell@yahoo.com>
Steve Howell <showell@zulip.com> <showell@zulipchat.com>
Steve Howell <showell@zulip.com> <steve@humbughq.com>
Steve Howell <showell@zulip.com> <steve@zulip.com>
Tim Abbott <tabbott@zulip.com> <tabbott@dropbox.com>
Tim Abbott <tabbott@zulip.com> <tabbott@humbughq.com>
Tim Abbott <tabbott@zulip.com> <tabbott@mit.edu>
Tim Abbott <tabbott@zulip.com> <tabbott@zulipchat.com>
Vishnu KS <vishnu@zulip.com> <hackerkid@vishnuks.com>
Vishnu KS <vishnu@zulip.com> <yo@vishnuks.com>
Alya Abbott <alya@zulip.com> <alyaabbott@elance-odesk.com>


@@ -1,8 +0,0 @@
/corporate/tests/stripe_fixtures
/locale
/static/third
/templates/**/*.md
/tools/setup/emoji/emoji_map.json
/zerver/tests/fixtures
/zerver/webhooks/*/doc.md
/zerver/webhooks/*/fixtures


@@ -1,15 +0,0 @@
{
"source_directories": ["."],
"taint_models_path": [
"stubs/taint",
"zulip-py3-venv/lib/pyre_check/taint/"
],
"search_path": [
"stubs/",
"zulip-py3-venv/lib/pyre_check/stubs/"
],
"typeshed": "zulip-py3-venv/lib/pyre_check/typeshed/",
"exclude": [
"/srv/zulip/zulip-py3-venv/.*"
]
}


@@ -1 +0,0 @@
sonar.inclusions=**/*.py,**/*.html

.stylelintrc

@@ -0,0 +1,67 @@
{
"rules": {
# Stylistic rules for CSS.
"function-comma-space-after": "always",
"function-comma-space-before": "never",
"function-max-empty-lines": 0,
"function-whitespace-after": "always",
"value-keyword-case": "lower",
"value-list-comma-newline-after": "always-multi-line",
"value-list-comma-space-after": "always-single-line",
"value-list-comma-space-before": "never",
"value-list-max-empty-lines": 0,
"unit-case": "lower",
"property-case": "lower",
"color-hex-case": "lower",
"declaration-bang-space-before": "always",
"declaration-colon-newline-after": "always-multi-line",
"declaration-colon-space-after": "always-single-line",
"declaration-colon-space-before": "never",
"declaration-block-semicolon-newline-after": "always",
"declaration-block-semicolon-space-before": "never",
"declaration-block-trailing-semicolon": "always",
"block-closing-brace-empty-line-before": "never",
"block-closing-brace-newline-after": "always",
"block-closing-brace-newline-before": "always",
"block-opening-brace-newline-after": "always",
"block-opening-brace-space-before": "always",
"selector-attribute-brackets-space-inside": "never",
"selector-attribute-operator-space-after": "never",
"selector-attribute-operator-space-before": "never",
"selector-combinator-space-after": "always",
"selector-combinator-space-before": "always",
"selector-descendant-combinator-no-non-space": true,
"selector-pseudo-class-parentheses-space-inside": "never",
"selector-pseudo-element-case": "lower",
"selector-pseudo-element-colon-notation": "double",
"selector-type-case": "lower",
"selector-list-comma-newline-after": "always",
"selector-list-comma-space-before": "never",
"media-feature-colon-space-after": "always",
"media-feature-colon-space-before": "never",
"media-feature-name-case": "lower",
"media-feature-parentheses-space-inside": "never",
"media-feature-range-operator-space-after": "always",
"media-feature-range-operator-space-before": "always",
"media-query-list-comma-newline-after": "always",
"media-query-list-comma-space-before": "never",
"at-rule-name-case": "lower",
"at-rule-name-space-after": "always",
"at-rule-semicolon-newline-after": "always",
"at-rule-semicolon-space-before": "never",
"comment-whitespace-inside": "always",
"indentation": 4,
# Limit language features
"color-no-hex": true,
"color-named": "never"
}
}

.travis.yml

@@ -0,0 +1,67 @@
# See https://zulip.readthedocs.io/en/latest/testing/continuous-integration.html for
# high-level documentation on our Travis CI setup.
dist: trusty
group: deprecated-2017Q4
install:
# Disable sometimes-broken sources.list in Travis base images
- sudo rm -vf /etc/apt/sources.list.d/*
- sudo apt-get update
# Disable Travis CI's built-in NVM installation
- mispipe "mv ~/.nvm ~/.travis-nvm-disabled" ts
# Install codecov, the library for the code coverage reporting tool we use
# With a retry to minimize impact of transient networking errors.
- mispipe "pip install codecov" ts || mispipe "pip install codecov" ts
# This is the main setup job for the test suite
- mispipe "tools/ci/setup-$TEST_SUITE" ts
# Clean any caches that are not in use to avoid our cache
# becoming huge.
- mispipe "scripts/lib/clean-unused-caches --verbose --threshold 0" ts
script:
# We unset GEM_PATH here as a hack to work around Travis CI having
# broken running their system puppet with Ruby. See
# https://travis-ci.org/zulip/zulip/jobs/240120991 for an example traceback.
- unset GEM_PATH
- mispipe "./tools/ci/$TEST_SUITE" ts
cache:
yarn: true
apt: false
directories:
- $HOME/zulip-venv-cache
- $HOME/zulip-npm-cache
- $HOME/zulip-emoji-cache
- $HOME/node
- $HOME/misc
env:
global:
- BOTO_CONFIG=/nonexistent
language: python
# Our test suites generally run on Python 3.4, the version in
# Ubuntu 14.04 trusty, which is the oldest OS release we support.
matrix:
include:
# Travis will actually run the jobs in the order they're listed here;
# that doesn't seem to be documented, but it's what we see empirically.
# We only get 4 jobs running at a time, so we try to make the first few
# the most likely to break.
- python: "3.4"
env: TEST_SUITE=production
# Other suites moved to CircleCI -- see .circleci/.
sudo: required
addons:
artifacts:
paths:
# Casper debugging data (screenshots, etc.) is super useful for
# debugging test flakes.
- $(ls var/casper/* | tr "\n" ":")
- $(ls /tmp/zulip-test-event-log/* | tr "\n" ":")
postgresql: "9.3"
apt:
packages:
- moreutils
after_success:
- codecov


@@ -3,31 +3,31 @@ host = https://www.transifex.com
 lang_map = zh-Hans: zh_Hans, zh-Hant: zh_Hant
 [zulip.djangopo]
-file_filter = locale/<lang>/LC_MESSAGES/django.po
-source_file = locale/en/LC_MESSAGES/django.po
+source_file = static/locale/en/LC_MESSAGES/django.po
 source_lang = en
 type = PO
+file_filter = static/locale/<lang>/LC_MESSAGES/django.po
 [zulip.translationsjson]
-file_filter = locale/<lang>/translations.json
-source_file = locale/en/translations.json
+source_file = static/locale/en/translations.json
 source_lang = en
 type = KEYVALUEJSON
+file_filter = static/locale/<lang>/translations.json
 [zulip.mobile]
-file_filter = locale/<lang>/mobile.json
-source_file = locale/en/mobile.json
+source_file = static/locale/en/mobile.json
 source_lang = en
 type = KEYVALUEJSON
+file_filter = static/locale/<lang>/mobile.json
 [zulip-test.djangopo]
-file_filter = locale/<lang>/LC_MESSAGES/django.po
-source_file = locale/en/LC_MESSAGES/django.po
+source_file = static/locale/en/LC_MESSAGES/django.po
 source_lang = en
 type = PO
+file_filter = static/locale/<lang>/LC_MESSAGES/django.po
 [zulip-test.translationsjson]
-file_filter = locale/<lang>/translations.json
-source_file = locale/en/translations.json
+source_file = static/locale/en/translations.json
 source_lang = en
 type = KEYVALUEJSON
+file_filter = static/locale/<lang>/translations.json


@@ -1 +0,0 @@
-ignore-scripts true


@@ -14,46 +14,46 @@ This isn't an exhaustive list of things that you can't do. Rather, take it
 in the spirit in which it's intended --- a guide to make it easier to enrich
 all of us and the technical communities in which we participate.

-## Expected behavior
+## Expected Behavior

 The following behaviors are expected and requested of all community members:

-- Participate. In doing so, you contribute to the health and longevity of
+* Participate. In doing so, you contribute to the health and longevity of
   the community.
-- Exercise consideration and respect in your speech and actions.
-- Attempt collaboration before conflict. Assume good faith.
-- Refrain from demeaning, discriminatory, or harassing behavior and speech.
-- Take action or alert community leaders if you notice a dangerous
+* Exercise consideration and respect in your speech and actions.
+* Attempt collaboration before conflict. Assume good faith.
+* Refrain from demeaning, discriminatory, or harassing behavior and speech.
+* Take action or alert community leaders if you notice a dangerous
   situation, someone in distress, or violations of this code, even if they
   seem inconsequential.
-- Community event venues may be shared with members of the public; be
+* Community event venues may be shared with members of the public; be
   respectful to all patrons of these locations.

-## Unacceptable behavior
+## Unacceptable Behavior

 The following behaviors are considered harassment and are unacceptable
 within the Zulip community:

-- Jokes or derogatory language that singles out members of any race,
+* Jokes or derogatory language that singles out members of any race,
   ethnicity, culture, national origin, color, immigration status, social and
   economic class, educational level, language proficiency, sex, sexual
   orientation, gender identity and expression, age, size, family status,
   political belief, religion, and mental and physical ability.
-- Violence, threats of violence, or violent language directed against
+* Violence, threats of violence, or violent language directed against
   another person.
-- Disseminating or threatening to disseminate another person's personal
+* Disseminating or threatening to disseminate another person's personal
   information.
-- Personal insults of any sort.
-- Posting or displaying sexually explicit or violent material.
-- Inappropriate photography or recording.
-- Deliberate intimidation, stalking, or following (online or in person).
-- Unwelcome sexual attention. This includes sexualized comments or jokes,
+* Personal insults of any sort.
+* Posting or displaying sexually explicit or violent material.
+* Inappropriate photography or recording.
+* Deliberate intimidation, stalking, or following (online or in person).
+* Unwelcome sexual attention. This includes sexualized comments or jokes,
   inappropriate touching or groping, and unwelcomed sexual advances.
-- Sustained disruption of community events, including talks and
+* Sustained disruption of community events, including talks and
   presentations.
-- Advocating for, or encouraging, any of the behaviors above.
+* Advocating for, or encouraging, any of the behaviors above.

-## Reporting and enforcement
+## Reporting and Enforcement

 Harassment and other code of conduct violations reduce the value of the
 community for everyone. If someone makes you or anyone else feel unsafe or
@@ -78,7 +78,7 @@ something you can do while a violation is happening, do it. A lot of the
 harms of harassment and other violations can be mitigated by the victim
 knowing that the other people present are on their side.

-All reports will be kept confidential. In some cases, we may determine that a
+All reports will be kept confidential. In some cases we may determine that a
 public statement will need to be made. In such cases, the identities of all
 victims and reporters will remain confidential unless those individuals
 instruct us otherwise.
@@ -95,10 +95,11 @@ behavior occurring outside the scope of community activities when such
 behavior has the potential to adversely affect the safety and well-being of
 community members.

-## License and attribution
+## License and Attribution

 This Code of Conduct is adapted from the
+[Citizen Code of Conduct](http://citizencodeofconduct.org/) and the
 [Django Code of Conduct](https://www.djangoproject.com/conduct/), and is
 under a
-[Creative Commons BY-SA](https://creativecommons.org/licenses/by-sa/4.0/)
+[Creative Commons BY-SA](http://creativecommons.org/licenses/by-sa/4.0/)
 license.


@@ -13,12 +13,10 @@ user, or anything else. Make sure to read the
 before posting. The Zulip community is also governed by a
 [code of conduct](https://zulip.readthedocs.io/en/latest/code-of-conduct.html).

-You can subscribe to
-[zulip-devel-announce@googlegroups.com](https://groups.google.com/g/zulip-devel-announce)
-or our [Twitter](https://twitter.com/zulip) account for a very low
-traffic (<1 email/month) way to hear about things like mentorship
-opportunities with Google Summer of Code, in-person sprints at
-conferences, and other opportunities to contribute.
+You can subscribe to zulip-devel@googlegroups.com for a lower traffic (~1
+email/month) way to hear about things like mentorship opportunities with Google
+Code-in, in-person sprints at conferences, and other opportunities to
+contribute.

 ## Ways to contribute
@@ -26,31 +24,29 @@ To make a code or documentation contribution, read our
 [step-by-step guide](#your-first-codebase-contribution) to getting
 started with the Zulip codebase. A small sample of the type of work that
 needs doing:

-- Bug squashing and feature development on our Python/Django
+* Bug squashing and feature development on our Python/Django
   [backend](https://github.com/zulip/zulip), web
   [frontend](https://github.com/zulip/zulip), React Native
   [mobile app](https://github.com/zulip/zulip-mobile), or Electron
-  [desktop app](https://github.com/zulip/zulip-desktop).
-- Building out our
+  [desktop app](https://github.com/zulip/zulip-electron).
+* Building out our
   [Python API and bots](https://github.com/zulip/python-zulip-api) framework.
-- [Writing an integration](https://zulip.com/api/integrations-overview).
-- Improving our [user](https://zulip.com/help/) or
+* [Writing an integration](https://zulipchat.com/api/integrations-overview).
+* Improving our [user](https://zulipchat.com/help/) or
   [developer](https://zulip.readthedocs.io/en/latest/) documentation.
-- [Reviewing code](https://zulip.readthedocs.io/en/latest/contributing/code-reviewing.html)
+* [Reviewing code](https://zulip.readthedocs.io/en/latest/contributing/code-reviewing.html)
   and manually testing pull requests.

 **Non-code contributions**: Some of the most valuable ways to contribute
 don't require touching the codebase at all. We list a few of them below:

-- [Reporting issues](#reporting-issues), including both feature requests and
+* [Reporting issues](#reporting-issues), including both feature requests and
   bug reports.
-- [Giving feedback](#user-feedback) if you are evaluating or using Zulip.
-- [Sponsor Zulip](https://github.com/sponsors/zulip) through the GitHub sponsors program.
-- [Translating](https://zulip.readthedocs.io/en/latest/translating/translating.html)
+* [Giving feedback](#user-feedback) if you are evaluating or using Zulip.
+* [Translating](https://zulip.readthedocs.io/en/latest/translating/translating.html)
   Zulip.
-- [Outreach](#zulip-outreach): Star us on GitHub, upvote us
-  on product comparison sites, or write for [the Zulip blog](https://blog.zulip.org/).
+* [Outreach](#zulip-outreach): Star us on GitHub, upvote us
+  on product comparison sites, or write for [the Zulip blog](http://blog.zulip.org/).

 ## Your first (codebase) contribution
@@ -58,8 +54,7 @@ This section has a step by step guide to starting as a Zulip codebase
 contributor. It's long, but don't worry about doing all the steps perfectly;
 no one gets it right the first time, and there are a lot of people available
 to help.

-- First, make an account on the
+* First, make an account on the
   [Zulip community server](https://zulip.readthedocs.io/en/latest/contributing/chat-zulip-org.html),
   paying special attention to the community norms. If you'd like, introduce
   yourself in
@@ -67,18 +62,20 @@ to help.
   your name as the topic. Bonus: tell us about your first impressions of
   Zulip, and anything that felt confusing/broken as you started using the
   product.
-- Read [What makes a great Zulip contributor](#what-makes-a-great-zulip-contributor).
-- [Install the development environment](https://zulip.readthedocs.io/en/latest/development/overview.html),
+* Read [What makes a great Zulip contributor](#what-makes-a-great-zulip-contributor).
+* [Install the development environment](https://zulip.readthedocs.io/en/latest/development/overview.html),
   getting help in
   [#development help](https://chat.zulip.org/#narrow/stream/49-development-help)
   if you run into any troubles.
-- Read the
+* Read the
   [Zulip guide to Git](https://zulip.readthedocs.io/en/latest/git/index.html)
   and do the Git tutorial (coming soon) if you are unfamiliar with
   Git, getting help in
   [#git help](https://chat.zulip.org/#narrow/stream/44-git-help) if
   you run into any troubles. Be sure to check out the
   [extremely useful Zulip-specific tools page](https://zulip.readthedocs.io/en/latest/git/zulip-tools.html).
+* Sign the
+  [Dropbox Contributor License Agreement](https://opensource.dropbox.com/cla/).

 ### Picking an issue
@@ -86,32 +83,30 @@ Now, you're ready to pick your first issue! There are hundreds of open issues
 in the main codebase alone. This section will help you find an issue to work
 on.

-- If you're interested in
+* If you're interested in
   [mobile](https://github.com/zulip/zulip-mobile/issues?q=is%3Aopen+is%3Aissue),
-  [desktop](https://github.com/zulip/zulip-desktop/issues?q=is%3Aopen+is%3Aissue),
+  [desktop](https://github.com/zulip/zulip-electron/issues?q=is%3Aopen+is%3Aissue),
   or
   [bots](https://github.com/zulip/python-zulip-api/issues?q=is%3Aopen+is%3Aissue)
   development, check the respective links for open issues, or post in
   [#mobile](https://chat.zulip.org/#narrow/stream/48-mobile),
   [#desktop](https://chat.zulip.org/#narrow/stream/16-desktop), or
   [#integration](https://chat.zulip.org/#narrow/stream/127-integrations).
-- For the main server and web repository, we recommend browsing
-  recently opened issues to look for issues you are confident you can
-  fix correctly in a way that clearly communicates why your changes
-  are the correct fix. Our GitHub workflow bot, zulipbot, limits
-  users who have 0 commits merged to claiming a single issue labeled
-  with "good first issue" or "help wanted".
-- We also partition all of our issues in the main repo into areas like
+* For the main server and web repository, start by looking through issues
+  with the label
+  [good first issue](https://github.com/zulip/zulip/issues?q=is%3Aopen+is%3Aissue+label%3A"good+first+issue").
+  These are smaller projects particularly suitable for a first contribution.
+* We also partition all of our issues in the main repo into areas like
   admin, compose, emoji, hotkeys, i18n, onboarding, search, etc. Look
   through our [list of labels](https://github.com/zulip/zulip/labels), and
   click on some of the `area:` labels to see all the issues related to your
   areas of interest.
-- If the lists of issues are overwhelming, post in
+* If the lists of issues are overwhelming, post in
   [#new members](https://chat.zulip.org/#narrow/stream/95-new-members) with a
   bit about your background and interests, and we'll help you out. The most
   important thing to say is whether you're looking for a backend (Python),
-  frontend (JavaScript and TypeScript), mobile (React Native), desktop (Electron),
-  documentation (English) or visual design (JavaScript/TypeScript + CSS) issue, and a
+  frontend (JavaScript), mobile (React Native), desktop (Electron),
+  documentation (English) or visual design (JavaScript + CSS) issue, and a
   bit about your programming experience and available time.

 We also welcome suggestions of features that you feel would be valuable or
@@ -121,22 +116,15 @@ have a new feature you'd like to add, we recommend you start by posting in
 feature idea and the problem that you're hoping to solve.

 Other notes:

-- For a first pull request, it's better to aim for a smaller contribution
+* For a first pull request, it's better to aim for a smaller contribution
   than a bigger one. Many first contributions have fewer than 10 lines of
   changes (not counting changes to tests).
-- The full list of issues explicitly looking for a contributor can be
-  found with the
+* The full list of issues looking for a contributor can be found with the
   [good first issue](https://github.com/zulip/zulip/issues?q=is%3Aopen+is%3Aissue+label%3A%22good+first+issue%22)
   and
   [help wanted](https://github.com/zulip/zulip/issues?q=is%3Aopen+is%3Aissue+label%3A%22help+wanted%22)
-  labels. Avoid issues with the "difficult" label unless you
-  understand why it is difficult and are confident you can resolve the
-  issue correctly and completely. Issues without one of these labels
-  are fair game if Tim has written a clear technical design proposal
-  in the issue, or it is a bug that you can reproduce and you are
-  confident you can fix the issue correctly.
-- For most new contributors, there's a lot to learn while making your first
+  labels.
+* For most new contributors, there's a lot to learn while making your first
   pull request. It's OK if it takes you a while; that's normal! You'll be
   able to work a lot faster as you build experience.
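The label filters in the issue links above (`label%3A%22good+first+issue%22` and the like) are just URL-encoded GitHub search queries. A narrow sketch of how that encoding comes about, handling only the characters that appear in this particular query (`%`, `:`, `"`, space) rather than the full reserved set a real URL-encoder would cover:

```shell
# URL-encode the GitHub issue-search query used in the links above.
# Only the characters present in this query are handled; escape '%'
# first so the replacement sequences aren't re-encoded.
query='is:open is:issue label:"good first issue"'
printf '%s' "$query" |
  sed -e 's/%/%25/g' -e 's/:/%3A/g' -e 's/"/%22/g' -e 's/ /+/g'
# -> is%3Aopen+is%3Aissue+label%3A%22good+first+issue%22
```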
@@ -147,20 +135,20 @@ the issue thread. [Zulipbot](https://github.com/zulip/zulipbot) is a GitHub
 workflow bot; it will assign you to the issue and label the issue as "in
 progress". Some additional notes:

-- You can only claim issues with the
+* You can only claim issues with the
   [good first issue](https://github.com/zulip/zulip/issues?q=is%3Aopen+is%3Aissue+label%3A%22good+first+issue%22)
   or
   [help wanted](https://github.com/zulip/zulip/issues?q=is%3Aopen+is%3Aissue+label%3A%22help+wanted%22)
   labels. Zulipbot will give you an error if you try to claim an issue
   without one of those labels.
-- You're encouraged to ask questions on how to best implement or debug your
+* You're encouraged to ask questions on how to best implement or debug your
   changes -- the Zulip maintainers are excited to answer questions to help
   you stay unblocked and working efficiently. You can ask questions on
   chat.zulip.org, or on the GitHub issue or pull request.
-- We encourage early pull requests for work in progress. Prefix the title of
+* We encourage early pull requests for work in progress. Prefix the title of
   work in progress pull requests with `[WIP]`, and remove the prefix when
   you think it might be mergeable and want it to be reviewed.
-- After updating a PR, add a comment to the GitHub thread mentioning that it
+* After updating a PR, add a comment to the GitHub thread mentioning that it
   is ready for another review. GitHub only notifies maintainers of the
   changes when you post a comment, so if you don't, your PR will likely be
   neglected by accident!
@@ -175,29 +163,30 @@ labels.
 ## What makes a great Zulip contributor?

-Zulip has a lot of experience working with new contributors. In our
-experience, these are the best predictors of success:
+Zulip runs a lot of [internship programs](#internship-programs), so we have
+a lot of experience with new contributors. In our experience, these are the
+best predictors of success:

-- Posting good questions. This generally means explaining your current
+* Posting good questions. This generally means explaining your current
   understanding, saying what you've done or tried so far, and including
   tracebacks or other error messages if appropriate.
-- Learning and practicing
+* Learning and practicing
   [Git commit discipline](https://zulip.readthedocs.io/en/latest/contributing/version-control.html#commit-discipline).
-- Submitting carefully tested code. This generally means checking your work
+* Submitting carefully tested code. This generally means checking your work
   through a combination of automated tests and manually clicking around the
   UI trying to find bugs in your work. See
   [things to look for](https://zulip.readthedocs.io/en/latest/contributing/code-reviewing.html#things-to-look-for)
   for additional ideas.
-- Posting
+* Posting
   [screenshots or GIFs](https://zulip.readthedocs.io/en/latest/tutorials/screenshot-and-gif-software.html)
   for frontend changes.
-- Being responsive to feedback on pull requests. This means incorporating or
+* Being responsive to feedback on pull requests. This means incorporating or
   responding to all suggested changes, and leaving a note if you won't be
   able to address things within a few days.
-- Being helpful and friendly on chat.zulip.org.
+* Being helpful and friendly on chat.zulip.org.

-These are also the main criteria we use to select candidates for all
-of our outreach programs.
+These are also the main criteria we use to select interns for all of our
+internship programs.

 ## Reporting issues
@@ -218,9 +207,8 @@ and how to reproduce it if known, your browser/OS if relevant, and a
 if appropriate.

 **Reporting security issues**. Please do not report security issues
-publicly, including on public streams on chat.zulip.org. You can
-email security@zulip.com. We create a CVE for every security
-issue in our released software.
+publicly, including on public streams on chat.zulip.org. You can email
+zulip-security@googlegroups.com. We create a CVE for every security issue.

 ## User feedback
@@ -230,63 +218,67 @@ hear about your experience with the product. If you're not sure what to
 write, here are some questions we're always very curious to know the answer
 to:

-- Evaluation: What is the process by which your organization chose or will
+* Evaluation: What is the process by which your organization chose or will
   choose a group chat product?
-- Pros and cons: What are the pros and cons of Zulip for your organization,
+* Pros and cons: What are the pros and cons of Zulip for your organization,
   and the pros and cons of other products you are evaluating?
-- Features: What are the features that are most important for your
-  organization? In the best-case scenario, what would your chat solution do
+* Features: What are the features that are most important for your
+  organization? In the best case scenario, what would your chat solution do
   for you?
-- Onboarding: If you remember it, what was your impression during your first
+* Onboarding: If you remember it, what was your impression during your first
   few minutes of using Zulip? What did you notice, and how did you feel? Was
   there anything that stood out to you as confusing, or broken, or great?
-- Organization: What does your organization do? How big is the organization?
+* Organization: What does your organization do? How big is the organization?
   A link to your organization's website?

-## Outreach programs
+## Internship programs

-Zulip participates in [Google Summer of Code
-(GSoC)](https://developers.google.com/open-source/gsoc/) every year.
-In the past, we've also participated in
-[Outreachy](https://www.outreachy.org/), [Google
-Code-In](https://developers.google.com/open-source/gci/), and hosted
-summer interns from Harvard, MIT, and Stanford.
+Zulip runs internship programs with
+[Outreachy](https://www.outreachy.org/),
+[Google Summer of Code (GSoC)](https://developers.google.com/open-source/gsoc/)
+[1], and the
+[MIT Externship program](https://alum.mit.edu/students/NetworkwithAlumni/ExternshipProgram),
+and has in the past taken summer interns from Harvard, MIT, and
+Stanford.

 While each third-party program has its own rules and requirements, the
 Zulip community's approaches all of these programs with these ideas in
 mind:

-- We try to make the application process as valuable for the applicant as
-  possible. Expect high-quality code reviews, a supportive community, and
+* We try to make the application process as valuable for the applicant as
+  possible. Expect high quality code reviews, a supportive community, and
   publicly viewable patches you can link to from your resume, regardless of
   whether you are selected.
-- To apply, you'll have to submit at least one pull request to a Zulip
+* To apply, you'll have to submit at least one pull request to a Zulip
   repository. Most students accepted to one of our programs have
   several merged pull requests (including at least one larger PR) by
   the time of the application deadline.
-- The main criteria we use is quality of your best contributions, and
+* The main criteria we use is quality of your best contributions, and
   the bullets listed at
   [What makes a great Zulip contributor](#what-makes-a-great-zulip-contributor).
   Because we focus on evaluating your best work, it doesn't hurt your
   application to makes mistakes in your first few PRs as long as your
   work improves.

-Most of our outreach program participants end up sticking around the
-project long-term, and many have become core team members, maintaining
-important parts of the project. We hope you apply!
+Zulip also participates in
+[Google Code-In](https://developers.google.com/open-source/gci/). Our
+selection criteria for Finalists and Grand Prize Winners is the same as our
+selection criteria for interns above.
+
+Most of our interns end up sticking around the project long-term, and many
+quickly become core team members. We hope you apply!

 ### Google Summer of Code

-The largest outreach program Zulip participates in is GSoC (14
-students in 2017; 11 in 2018; 17 in 2019; 18 in 2020). While we don't control how
-many slots Google allocates to Zulip, we hope to mentor a similar
-number of students in future summers.
+GSoC is by far the largest of our internship programs (we had 14 GSoC
+students in summer 2017). While we don't control how many slots
+Google allocates to Zulip, we hope to mentor a similar number of
+students in 2018.

 If you're reading this well before the application deadline and want
 to make your application strong, we recommend getting involved in the
 community and fixing issues in Zulip now. Having good contributions
-and building a reputation for doing good work is the best way to have
-a strong application. About half of Zulip's GSoC students for Summer
+and building a reputation for doing good work is best way to have a
+strong application. About half of Zulip's GSoC students for Summer
 2017 had made significant contributions to the project by February
 2017, and about half had not. Our
 [GSoC project ideas page][gsoc-guide] has lots more details on how
@@ -302,33 +294,36 @@ same as with GSoC, and it has no separate application process; your
GSoC application is your ZSoC application. If we'd like to select you GSoC application is your ZSoC application. If we'd like to select you
for ZSoC, we'll contact you when the GSoC results are announced. for ZSoC, we'll contact you when the GSoC results are announced.
[gsoc-guide]: https://zulip.readthedocs.io/en/latest/contributing/gsoc-ideas.html [gsoc-guide]: https://zulip.readthedocs.io/en/latest/overview/gsoc-ideas.html
[gsoc-faq]: https://developers.google.com/open-source/gsoc/faq [gsoc-faq]: https://developers.google.com/open-source/gsoc/faq
## Zulip outreach [1] Formally, [GSoC isn't an internship][gsoc-faq], but it is similar
enough that we're treating it as such for the purposes of this
documentation.
## Zulip Outreach
**Upvoting Zulip**. Upvotes and reviews make a big difference in the public
perception of projects like Zulip. We've collected a few sites below
where we know Zulip has been discussed. Doing everything in the following
list typically takes about 15 minutes.
- Star us on GitHub. There are four main repositories:
  [server/web](https://github.com/zulip/zulip),
  [mobile](https://github.com/zulip/zulip-mobile),
  [desktop](https://github.com/zulip/zulip-desktop), and
  [Python API](https://github.com/zulip/python-zulip-api).
- [Follow us](https://twitter.com/zulip) on Twitter.
For both of the following, you'll need to make an account on the site if you
don't already have one.
- [Like Zulip](https://alternativeto.net/software/zulip-chat-server/) on
  AlternativeTo. We recommend upvoting a couple of other products you like
  as well, both to give back to their community, and since single-upvote
  accounts are generally given less weight. You can also
  [upvote Zulip](https://alternativeto.net/software/slack/) on their page
  for Slack.
- [Add Zulip to your stack](https://stackshare.io/zulip) on StackShare, star
  it, and upvote the reasons why people like Zulip that you find most
  compelling. Again, we recommend adding a few other products that you like
  as well.
@@ -340,7 +335,7 @@ have been using Zulip for a while and want to contribute more.
about a technical aspect of Zulip can be a great way to spread the word
about Zulip.
We also occasionally [publish](https://blog.zulip.org/) long-form
articles related to Zulip. Our posts typically get tens of thousands
of views, and we always have good ideas for blog posts that we can
outline but don't have time to write. If you are an experienced writer

Dockerfile-dev Normal file

@@ -0,0 +1,17 @@
FROM ubuntu:trusty
EXPOSE 9991
RUN apt-get update && apt-get install -y wget
RUN localedef -i en_US -f UTF-8 en_US.UTF-8
RUN useradd -d /home/zulip -m zulip && echo 'zulip ALL=(ALL) NOPASSWD:ALL' >> /etc/sudoers
USER zulip
RUN ln -nsf /srv/zulip ~/zulip
RUN echo 'export LC_ALL="en_US.UTF-8" LANG="en_US.UTF-8" LANGUAGE="en_US.UTF-8"' >> ~zulip/.bashrc
RUN echo 'export LC_ALL="en_US.UTF-8" LANG="en_US.UTF-8" LANGUAGE="en_US.UTF-8"' >> ~zulip/.bash_profile
WORKDIR /srv/zulip

Dockerfile-postgresql

@@ -1,15 +1,42 @@
# To build run `docker build -f Dockerfile-postgresql .` from the root of the
# zulip repo.
# Install build tools and build tsearch_extras for the current postgres
# version. Currently the postgres images do not support automatic upgrading of
# the on-disk data in volumes. So the base image cannot currently be upgraded
# without users needing a manual pgdump and restore.
FROM postgres:10
RUN apt-get update \
&& DEBIAN_FRONTEND=noninteractive apt-get install -y \
postgresql-server-dev-$PG_MAJOR \
postgresql-server-dev-all \
git \
build-essential \
fakeroot \
devscripts
RUN git clone https://github.com/zulip/tsearch_extras.git \
&& cd tsearch_extras \
&& echo $PG_MAJOR > debian/pgversions \
&& pg_buildext updatecontrol \
&& debuild -b -uc -us
# Install tsearch_extras, hunspell, Zulip stop words, and run Zulip database
# init.
FROM postgres:10
ENV TSEARCH_EXTRAS_VERSION=0.4
ENV TSEARCH_EXTRAS_DEB=postgresql-${PG_MAJOR}-tsearch-extras_${TSEARCH_EXTRAS_VERSION}_amd64.deb
COPY --from=0 /${TSEARCH_EXTRAS_DEB} /tmp
COPY puppet/zulip/files/postgresql/zulip_english.stop /usr/share/postgresql/$PG_MAJOR/tsearch_data/zulip_english.stop
COPY scripts/setup/postgres-create-db /docker-entrypoint-initdb.d/postgres-create-db.sh
COPY scripts/setup/pgroonga-debian.asc /tmp
RUN apt-key add /tmp/pgroonga-debian.asc \
&& echo "deb http://packages.groonga.org/debian/ stretch main" > /etc/apt/sources.list.d/zulip.list \
&& apt-get update \
&& DEBIAN_FRONTEND=noninteractive apt-get install --no-install-recommends -y \
hunspell-en-us \
postgresql-${PG_MAJOR}-pgroonga \
&& DEBIAN_FRONTEND=noninteractive dpkg -i /tmp/${TSEARCH_EXTRAS_DEB} \
&& rm /tmp/${TSEARCH_EXTRAS_DEB} \
&& ln -sf /var/cache/postgresql/dicts/en_us.dict "/usr/share/postgresql/$PG_MAJOR/tsearch_data/en_us.dict" \
&& ln -sf /var/cache/postgresql/dicts/en_us.affix "/usr/share/postgresql/$PG_MAJOR/tsearch_data/en_us.affix" \
&& rm -rf /var/lib/apt/lists/*


@@ -1,3 +1,4 @@
Copyright 2011-2018 Dropbox, Inc., Kandra Labs, Inc., and contributors
Apache License
Version 2.0, January 2004

NOTICE

@@ -1,5 +1,3 @@
Copyright 2012-2015 Dropbox, Inc., 2015-2021 Kandra Labs, Inc., and contributors
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this project except in compliance with the License.
You may obtain a copy of the License at


@@ -5,19 +5,17 @@ immediacy of real-time chat with the productivity benefits of threaded
conversations. Zulip is used by open source projects, Fortune 500 companies,
large standards bodies, and others who need a real-time chat system that
allows users to easily process hundreds or thousands of messages a day. With
over 700 contributors merging over 500 commits a month, Zulip is also the
largest and fastest growing open source group chat project.

[![GitHub Actions build status](https://github.com/zulip/zulip/actions/workflows/zulip-ci.yml/badge.svg)](https://github.com/zulip/zulip/actions/workflows/zulip-ci.yml?query=branch%3Amain)
[![coverage status](https://img.shields.io/codecov/c/github/zulip/zulip/main.svg)](https://codecov.io/gh/zulip/zulip)
[![Mypy coverage](https://img.shields.io/badge/mypy-100%25-green.svg)][mypy-coverage]
[![code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
[![code style: prettier](https://img.shields.io/badge/code_style-prettier-ff69b4.svg)](https://github.com/prettier/prettier)
[![GitHub release](https://img.shields.io/github/release/zulip/zulip.svg)](https://github.com/zulip/zulip/releases/latest)
[![docs](https://readthedocs.org/projects/zulip/badge/?version=latest)](https://zulip.readthedocs.io/en/latest/)
[![Zulip chat](https://img.shields.io/badge/zulip-join_chat-brightgreen.svg)](https://chat.zulip.org)
[![Twitter](https://img.shields.io/badge/twitter-@zulip-blue.svg?style=flat)](https://twitter.com/zulip)
[![GitHub Sponsors](https://img.shields.io/github/sponsors/zulip)](https://github.com/sponsors/zulip)

[mypy-coverage]: https://blog.zulip.org/2016/10/13/static-types-in-python-oh-mypy/
@@ -30,14 +28,14 @@ and tell us what's up!
You might be interested in:

- **Contributing code**. Check out our
  [guide for new contributors](https://zulip.readthedocs.io/en/latest/overview/contributing.html)
  to get started. Zulip prides itself on maintaining a clean and
  well-tested codebase, and a stock of hundreds of
  [beginner-friendly issues][beginner-friendly].

- **Contributing non-code**.
  [Report an issue](https://zulip.readthedocs.io/en/latest/overview/contributing.html#reporting-issues),
  [translate](https://zulip.readthedocs.io/en/latest/translating/translating.html) Zulip
  into your language,
  [write](https://zulip.readthedocs.io/en/latest/overview/contributing.html#zulip-outreach)
@@ -45,36 +43,42 @@ You might be interested in:
  [give us feedback](https://zulip.readthedocs.io/en/latest/overview/contributing.html#user-feedback). We
  would love to hear from you, even if you're just trying the product out.

- **Supporting Zulip**. Advocate for your organization to use Zulip, become a
  [sponsor](https://github.com/sponsors/zulip), write a
  review in the mobile app stores, or
  [upvote Zulip](https://zulip.readthedocs.io/en/latest/overview/contributing.html#zulip-outreach) on
  product comparison sites.

- **Checking Zulip out**. The best way to see Zulip in action is to drop by
  the
  [Zulip community server](https://zulip.readthedocs.io/en/latest/contributing/chat-zulip-org.html). We
  also recommend reading Zulip for
  [open source](https://zulip.com/for/open-source/), Zulip for
  [companies](https://zulip.com/for/companies/), or Zulip for
  [working groups and part time communities](https://zulip.com/for/working-groups-and-communities/).

- **Running a Zulip server**. Use a preconfigured
  [DigitalOcean droplet](https://marketplace.digitalocean.com/apps/zulip),
  [install Zulip](https://zulip.readthedocs.io/en/stable/production/install.html)
  directly, or use Zulip's
  experimental [Docker image](https://zulip.readthedocs.io/en/latest/production/deployment.html#zulip-in-docker).
  Commercial support is available; see <https://zulip.com/plans> for details.

- **Using Zulip without setting up a server**. <https://zulip.com>
  offers free and commercial hosting, including providing our paid
  plan for free to fellow open source projects.

- **Participating in [outreach
  programs](https://zulip.readthedocs.io/en/latest/overview/contributing.html#outreach-programs)**
  like Google Summer of Code.

You may also be interested in reading our [blog](https://blog.zulip.org/) or
following us on [Twitter](https://twitter.com/zulip).

Zulip is distributed under the
[Apache 2.0](https://github.com/zulip/zulip/blob/main/LICENSE) license.

[beginner-friendly]: https://github.com/zulip/zulip/issues?q=is%3Aopen+is%3Aissue+label%3A%22good+first+issue%22


@@ -1,32 +0,0 @@
# Security policy
Security announcements are sent to zulip-announce@googlegroups.com,
so you should subscribe if you are running Zulip in production.
## Reporting a vulnerability
We love responsible reports of (potential) security issues in Zulip,
whether in the latest release or our development branch.
Our security contact is security@zulip.com. Reporters should expect a
response within 24 hours.
Please include details on the issue and how you'd like to be credited
in our release notes when we publish the fix.
Our [security model][security-model] document may be a helpful
resource.
## Supported versions
Zulip provides security support for the latest major release, in the
form of minor security/maintenance releases.
We work hard to make [upgrades][upgrades] reliable, so that there's no
reason to run older major releases.
See also our documentation on the [Zulip release lifecycle][release-lifecycle].
[security-model]: https://zulip.readthedocs.io/en/latest/production/security-model.html
[upgrades]: https://zulip.readthedocs.io/en/latest/production/upgrade-or-modify.html#upgrading-to-a-release
[release-lifecycle]: https://zulip.readthedocs.io/en/latest/overview/release-lifecycle.html

Vagrantfile vendored

@@ -2,18 +2,60 @@
VAGRANTFILE_API_VERSION = "2"
def command?(name)
  `which #{name} > /dev/null 2>&1`
  $?.success?
end

if Vagrant::VERSION == "1.8.7"
  path = `command -v curl`
  if path.include?("/opt/vagrant/embedded/bin/curl")
    puts "In Vagrant 1.8.7, curl is broken. Please use Vagrant 2.0.2 " \
         "or run 'sudo rm -f /opt/vagrant/embedded/bin/curl' to fix the " \
         "issue before provisioning. See " \
         "https://github.com/mitchellh/vagrant/issues/7997 " \
         "for reference."
    exit
  end
end
# Workaround: the lxc-config in vagrant-lxc is incompatible with changes in
# LXC 2.1.0, found in Ubuntu 17.10 artful. LXC 2.1.1 (in 18.04 LTS bionic)
# ignores the old config key, so this will only be needed for artful.
#
# vagrant-lxc upstream has an attempted fix:
# https://github.com/fgrehm/vagrant-lxc/issues/445
# but it didn't work in our testing. This is a temporary issue, so we just
# hack in a fix: we patch the skeleton `lxc-config` file right in the
# distribution of the vagrant-lxc "box" we use. If the user doesn't yet
# have the box (e.g. on first setup), Vagrant would download it but too
# late for us to patch it like this; so we prompt them to explicitly add it
# first and then rerun.
if ['up', 'provision'].include? ARGV[0]
  if command? "lxc-ls"
    LXC_VERSION = `lxc-ls --version`.strip unless defined? LXC_VERSION
    if LXC_VERSION == "2.1.0"
      lxc_config_file = ENV['HOME'] + "/.vagrant.d/boxes/fgrehm-VAGRANTSLASH-trusty64-lxc/1.2.0/lxc/lxc-config"
      if File.file?(lxc_config_file)
        lines = File.readlines(lxc_config_file)
        deprecated_line = "lxc.pivotdir = lxc_putold\n"
        if lines[1] == deprecated_line
          lines[1] = "# #{deprecated_line}"
          File.open(lxc_config_file, 'w') do |f|
            f.puts(lines)
          end
        end
      else
        puts 'You are running LXC 2.1.0, and fgrehm/trusty64-lxc box is incompatible '\
             "with it by default. First add the box by doing:\n"\
             "  vagrant box add https://vagrantcloud.com/fgrehm/trusty64-lxc\n"\
             'Once this command succeeds, do "vagrant up" again.'
        exit
      end
    end
  end
end
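The box-patching workaround above boils down to: read the skeleton `lxc-config`, and if its second line is the deprecated `lxc.pivotdir` key, comment it out in place. A minimal Python sketch of the same logic (the `patch_lxc_config` helper and use of `pathlib` are illustrative, not part of the Vagrantfile):

```python
from pathlib import Path

DEPRECATED_LINE = "lxc.pivotdir = lxc_putold\n"

def patch_lxc_config(path: Path) -> bool:
    """Comment out the deprecated lxc.pivotdir key if it sits on line 2
    of the skeleton lxc-config, mirroring the Vagrantfile workaround."""
    lines = path.read_text().splitlines(keepends=True)
    if len(lines) > 1 and lines[1] == DEPRECATED_LINE:
        lines[1] = "# " + DEPRECATED_LINE
        path.write_text("".join(lines))
        return True  # patched
    return False  # nothing to do: already patched, or key absent
```

Like the Ruby version, the patch is idempotent: a second run finds the commented-out line and leaves the file alone.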
# Workaround: Vagrant removed the atlas.hashicorp.com to
# vagrantcloud.com redirect in February 2018. The value of
# DEFAULT_SERVER_URL in Vagrant versions less than 1.9.3 is
@@ -21,59 +63,40 @@ end
# updating of boxes (since the old URL doesn't work). See
# https://github.com/hashicorp/vagrant/issues/9442
if Vagrant::DEFAULT_SERVER_URL == "atlas.hashicorp.com"
  Vagrant::DEFAULT_SERVER_URL.replace("https://vagrantcloud.com")
end
# Monkey patch https://github.com/hashicorp/vagrant/pull/10879 so we
# can fall back to another provider if docker is not installed.
begin
require Vagrant.source_root.join("plugins", "providers", "docker", "provider")
rescue LoadError
else
VagrantPlugins::DockerProvider::Provider.class_eval do
method(:usable?).owner == singleton_class or def self.usable?(raise_error = false)
VagrantPlugins::DockerProvider::Driver.new.execute("docker", "version")
true
rescue Vagrant::Errors::CommandUnavailable, VagrantPlugins::DockerProvider::Errors::ExecuteError
raise if raise_error
return false
end
end
end end
Vagrant.configure(VAGRANTFILE_API_VERSION) do |config|
  # For LXC. VirtualBox hosts use a different box, described below.
  config.vm.box = "fgrehm/trusty64-lxc"

  # The Zulip development environment runs on 9991 on the guest.
  host_port = 9991
  http_proxy = https_proxy = no_proxy = nil
  host_ip_addr = "127.0.0.1"

  # System settings for the virtual machine.
  vm_num_cpus = "2"
  vm_memory = "2048"

  ubuntu_mirror = ""
  vboxadd_version = nil

  config.vm.synced_folder ".", "/vagrant", disabled: true
  if (/darwin/ =~ RUBY_PLATFORM) != nil
    config.vm.synced_folder ".", "/srv/zulip", type: "nfs",
      linux__nfs_options: ['rw']
    config.vm.network "private_network", type: "dhcp"
  else
    config.vm.synced_folder ".", "/srv/zulip"
  end
  vagrant_config_file = ENV["HOME"] + "/.zulip-vagrant-config"
  if File.file?(vagrant_config_file)
    IO.foreach(vagrant_config_file) do |line|
      line.chomp!
      key, value = line.split(nil, 2)
      case key
      when /^([#;]|$)/ # ignore comments
      when "HTTP_PROXY"; http_proxy = value
      when "HTTPS_PROXY"; https_proxy = value
      when "NO_PROXY"; no_proxy = value
      when "HOST_PORT"; host_port = value.to_i
      when "HOST_IP_ADDR"; host_ip_addr = value
      when "GUEST_CPUS"; vm_num_cpus = value
      when "GUEST_MEMORY_MB"; vm_memory = value
      when "UBUNTU_MIRROR"; ubuntu_mirror = value
      when "VBOXADD_VERSION"; vboxadd_version = value
      end
    end
  end
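The `~/.zulip-vagrant-config` format parsed above is plain whitespace-separated `KEY value` lines, with `#` and `;` starting comment lines. For illustration, the same parse in Python (`parse_vagrant_config` is a hypothetical helper, not part of the repo):

```python
from typing import Dict

def parse_vagrant_config(text: str) -> Dict[str, str]:
    """Parse 'KEY value' lines, skipping blanks and lines starting with
    '#' or ';', like the Vagrantfile's IO.foreach loop."""
    settings: Dict[str, str] = {}
    for line in text.splitlines():
        stripped = line.strip()
        if not stripped or stripped[0] in "#;":
            continue  # comment or blank line
        parts = stripped.split(None, 1)  # split on the first whitespace run
        settings[parts[0]] = parts[1] if len(parts) > 1 else ""
    return settings
```

As in the Ruby loop, the value is everything after the first whitespace run, so values like proxy URLs or comma-separated `NO_PROXY` lists survive intact.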
@@ -91,57 +114,46 @@ Vagrant.configure(VAGRANTFILE_API_VERSION) do |config|
  elsif !http_proxy.nil? or !https_proxy.nil?
    # This prints twice due to https://github.com/hashicorp/vagrant/issues/7504
    # We haven't figured out a workaround.
    puts "You have specified value for proxy in ~/.zulip-vagrant-config file but did not " \
         "install the vagrant-proxyconf plugin. To install it, run `vagrant plugin install " \
         "vagrant-proxyconf` in a terminal. This error will appear twice."
    exit
  end
  config.vm.network "forwarded_port", guest: 9991, host: host_port, host_ip: host_ip_addr
  config.vm.network "forwarded_port", guest: 9994, host: host_port + 3, host_ip: host_ip_addr
  # Specify Docker provider before VirtualBox provider so it's preferred.
  config.vm.provider "docker" do |d, override|
    d.build_dir = File.join(__dir__, "tools", "setup", "dev-vagrant-docker")
    d.build_args = ["--build-arg", "VAGRANT_UID=#{Process.uid}"]
    if !ubuntu_mirror.empty?
      d.build_args += ["--build-arg", "UBUNTU_MIRROR=#{ubuntu_mirror}"]
    end
    d.has_ssh = true
    d.create_args = ["--ulimit", "nofile=1024:65536"]
  end
  config.vm.provider "virtualbox" do |vb, override|
    override.vm.box = "hashicorp/bionic64"
    # It's possible we can get away with just 1.5GB; more testing needed
    vb.memory = vm_memory
    vb.cpus = vm_num_cpus

    if !vboxadd_version.nil?
      override.vbguest.installer = Class.new(VagrantVbguest::Installers::Ubuntu) do
        define_method(:host_version) do |reload = false|
          VagrantVbguest::Version(vboxadd_version)
        end
      end
      override.vbguest.allow_downgrade = true
      override.vbguest.iso_path = "https://download.virtualbox.org/virtualbox/#{vboxadd_version}/VBoxGuestAdditions_#{vboxadd_version}.iso"
    end
  end
  config.vm.provider "hyperv" do |h, override|
    override.vm.box = "bento/ubuntu-18.04"
    h.memory = vm_memory
    h.maxmemory = vm_memory
    h.cpus = vm_num_cpus
  end

  config.vm.provider "parallels" do |prl, override|
    override.vm.box = "bento/ubuntu-18.04"
    override.vm.box_version = "202005.21.0"
    prl.memory = vm_memory
    prl.cpus = vm_num_cpus
  end

  $provision_script = <<SCRIPT
set -x
set -e
set -o pipefail
@@ -150,19 +162,29 @@ set -o pipefail
# something that we don't want to happen when running provision in a
# development environment not using Vagrant.
# Set the Ubuntu mirror
[ ! '#{ubuntu_mirror}' ] || sudo sed -i 's|http://\\(\\w*\\.\\)*archive\\.ubuntu\\.com/ubuntu/\\? |#{ubuntu_mirror} |' /etc/apt/sources.list
# Set the MOTD on the system to have Zulip instructions
sudo ln -nsf /srv/zulip/tools/setup/dev-motd /etc/update-motd.d/99-zulip-dev
sudo rm -f /etc/update-motd.d/10-help-text
sudo dpkg --purge landscape-client landscape-common ubuntu-release-upgrader-core update-manager-core update-notifier-common ubuntu-server
sudo dpkg-divert --add --rename /etc/default/motd-news
sudo sh -c 'echo ENABLED=0 > /etc/default/motd-news'
# If the host is running SELinux remount the /sys/fs/selinux directory as read only,
# needed for apt-get to work.
if [ -d "/sys/fs/selinux" ]; then
sudo mount -o remount,ro /sys/fs/selinux
fi
# Set default locale, this prevents errors if the user has another locale set.
if ! grep -q 'LC_ALL=C.UTF-8' /etc/default/locale; then
  echo "LC_ALL=C.UTF-8" | sudo tee -a /etc/default/locale
fi

# Set an environment variable, so that we won't print the virtualenv


@@ -1,198 +1,140 @@
import logging
import time
from collections import OrderedDict, defaultdict
from datetime import datetime, timedelta
from typing import Callable, Dict, Optional, Sequence, Tuple, Type, Union

from django.conf import settings
from django.db import connection
from django.db.models import F
from psycopg2.sql import SQL, Composable, Identifier, Literal
from analytics.models import (
    BaseCount,
    FillState,
    InstallationCount,
    RealmCount,
    StreamCount,
    UserCount,
    installation_epoch,
)
from zerver.lib.logging_util import log_to_file
from zerver.lib.timestamp import ceiling_to_day, ceiling_to_hour, floor_to_hour, verify_UTC
from zerver.models import (
    Message,
    Realm,
    RealmAuditLog,
    Stream,
    UserActivityInterval,
    UserProfile,
    models,
)
## Logging setup ##

logger = logging.getLogger("zulip.management")
log_to_file(logger, settings.ANALYTICS_LOG_PATH)

# You can't subtract timedelta.max from a datetime, so use this instead
TIMEDELTA_MAX = timedelta(days=365 * 1000)
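The comment above is easy to verify: subtracting `timedelta.max` (roughly 2.7 million years) from a `datetime` overflows Python's supported range, while the 1000-year surrogate stays comfortably inside it. A quick check:

```python
from datetime import datetime, timedelta

TIMEDELTA_MAX = timedelta(days=365 * 1000)  # the surrogate used above

now = datetime(2019, 3, 15)
overflowed = False
try:
    now - timedelta.max  # far outside datetime's representable range
except OverflowError:
    overflowed = True

# The 1000-year surrogate still yields a valid datetime (around year 1019).
earliest = now - TIMEDELTA_MAX
```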
## Class definitions ##

class CountStat:
    HOUR = "hour"
    DAY = "day"
    FREQUENCIES = frozenset([HOUR, DAY])
    @property
    def time_increment(self) -> timedelta:
        if self.frequency == CountStat.HOUR:
            return timedelta(hours=1)
        return timedelta(days=1)

    def __init__(
        self,
        property: str,
        data_collector: "DataCollector",
        frequency: str,
        interval: Optional[timedelta] = None,
    ) -> None:
        self.property = property
        self.data_collector = data_collector
        # might have to do something different for bitfields
        if frequency not in self.FREQUENCIES:
            raise AssertionError(f"Unknown frequency: {frequency}")
        self.frequency = frequency
        if interval is not None:
            self.interval = interval
        else:
            self.interval = self.time_increment

    def __str__(self) -> str:
        return f"<CountStat: {self.property}>"
def last_successful_fill(self) -> Optional[datetime]:
fillstate = FillState.objects.filter(property=self.property).first()
if fillstate is None:
return None
if fillstate.state == FillState.DONE:
return fillstate.end_time
return fillstate.end_time - self.time_increment
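The two sides compute a stat's collection interval differently (a `time_increment` property on the 4.10 side, inline branching in `__init__` on the 2.0.2 side), but the resulting mapping is the same. A minimal standalone sketch of that frequency-to-interval logic, with the constants inlined for illustration:

```python
from datetime import timedelta
from typing import Optional

HOUR, DAY = "hour", "day"

def stat_interval(frequency: str, interval: Optional[timedelta] = None) -> timedelta:
    # An explicitly passed interval wins (used for stats that look back
    # further than one bucket); otherwise the interval defaults to one
    # bucket of the stat's frequency.
    if interval is not None:
        return interval
    if frequency == HOUR:
        return timedelta(hours=1)
    if frequency == DAY:
        return timedelta(days=1)
    raise AssertionError("Unknown frequency: %s" % (frequency,))
```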
 class LoggingCountStat(CountStat):
     def __init__(self, property: str, output_table: Type[BaseCount], frequency: str) -> None:
         CountStat.__init__(self, property, DataCollector(output_table, None), frequency)

 class DependentCountStat(CountStat):
-    def __init__(
-        self,
-        property: str,
-        data_collector: "DataCollector",
-        frequency: str,
-        interval: Optional[timedelta] = None,
-        dependencies: Sequence[str] = [],
-    ) -> None:
+    def __init__(self, property: str, data_collector: 'DataCollector', frequency: str,
+                 interval: Optional[timedelta]=None, dependencies: List[str]=[]) -> None:
         CountStat.__init__(self, property, data_collector, frequency, interval=interval)
         self.dependencies = dependencies
 class DataCollector:
-    def __init__(
-        self,
-        output_table: Type[BaseCount],
-        pull_function: Optional[Callable[[str, datetime, datetime, Optional[Realm]], int]],
-    ) -> None:
+    def __init__(self, output_table: Type[BaseCount],
+                 pull_function: Optional[Callable[[str, datetime, datetime], int]]) -> None:
         self.output_table = output_table
         self.pull_function = pull_function

 ## CountStat-level operations ##
-def process_count_stat(
-    stat: CountStat, fill_to_time: datetime, realm: Optional[Realm] = None
-) -> None:
-    # TODO: The realm argument is not yet supported, in that we don't
-    # have a solution for how to update FillState if it is passed.  It
-    # exists solely as partial plumbing for when we do fully implement
-    # doing single-realm analytics runs for use cases like data import.
-    #
-    # Also, note that for the realm argument to be properly supported,
-    # the CountStat object passed in needs to have come from
-    # e.g. get_count_stats(realm), i.e. have the realm_id already
-    # entered into the SQL query defined by the CountStat object.
+def process_count_stat(stat: CountStat, fill_to_time: datetime) -> None:
+    if stat.frequency == CountStat.HOUR:
+        time_increment = timedelta(hours=1)
+    elif stat.frequency == CountStat.DAY:
+        time_increment = timedelta(days=1)
+    else:
+        raise AssertionError("Unknown frequency: %s" % (stat.frequency,))
     verify_UTC(fill_to_time)
     if floor_to_hour(fill_to_time) != fill_to_time:
-        raise ValueError(f"fill_to_time must be on an hour boundary: {fill_to_time}")
+        raise ValueError("fill_to_time must be on an hour boundary: %s" % (fill_to_time,))

     fill_state = FillState.objects.filter(property=stat.property).first()
     if fill_state is None:
         currently_filled = installation_epoch()
-        fill_state = FillState.objects.create(
-            property=stat.property, end_time=currently_filled, state=FillState.DONE
-        )
-        logger.info("INITIALIZED %s %s", stat.property, currently_filled)
+        fill_state = FillState.objects.create(property=stat.property,
+                                              end_time=currently_filled,
+                                              state=FillState.DONE)
+        logger.info("INITIALIZED %s %s" % (stat.property, currently_filled))
     elif fill_state.state == FillState.STARTED:
-        logger.info("UNDO START %s %s", stat.property, fill_state.end_time)
+        logger.info("UNDO START %s %s" % (stat.property, fill_state.end_time))
         do_delete_counts_at_hour(stat, fill_state.end_time)
-        currently_filled = fill_state.end_time - stat.time_increment
+        currently_filled = fill_state.end_time - time_increment
         do_update_fill_state(fill_state, currently_filled, FillState.DONE)
-        logger.info("UNDO DONE %s", stat.property)
+        logger.info("UNDO DONE %s" % (stat.property,))
     elif fill_state.state == FillState.DONE:
         currently_filled = fill_state.end_time
     else:
-        raise AssertionError(f"Unknown value for FillState.state: {fill_state.state}.")
+        raise AssertionError("Unknown value for FillState.state: %s." % (fill_state.state,))

     if isinstance(stat, DependentCountStat):
         for dependency in stat.dependencies:
-            dependency_fill_time = COUNT_STATS[dependency].last_successful_fill()
+            dependency_fill_time = last_successful_fill(dependency)
             if dependency_fill_time is None:
-                logger.warning(
-                    "DependentCountStat %s run before dependency %s.", stat.property, dependency
-                )
+                logger.warning("DependentCountStat %s run before dependency %s." %
+                               (stat.property, dependency))
                 return
             fill_to_time = min(fill_to_time, dependency_fill_time)

-    currently_filled = currently_filled + stat.time_increment
+    currently_filled = currently_filled + time_increment
     while currently_filled <= fill_to_time:
-        logger.info("START %s %s", stat.property, currently_filled)
+        logger.info("START %s %s" % (stat.property, currently_filled))
         start = time.time()
         do_update_fill_state(fill_state, currently_filled, FillState.STARTED)
-        do_fill_count_stat_at_hour(stat, currently_filled, realm)
+        do_fill_count_stat_at_hour(stat, currently_filled)
         do_update_fill_state(fill_state, currently_filled, FillState.DONE)
         end = time.time()
-        currently_filled = currently_filled + stat.time_increment
-        logger.info("DONE %s (%dms)", stat.property, (end - start) * 1000)
+        currently_filled = currently_filled + time_increment
+        logger.info("DONE %s (%dms)" % (stat.property, (end-start)*1000))
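On both sides of the diff, `process_count_stat` advances one time bucket at a time, from one increment past the last `DONE` bucket up to (and including) `fill_to_time`. A self-contained sketch of just that loop's bucket arithmetic, with the FillState bookkeeping and data collection stripped out:

```python
from datetime import datetime, timedelta
from typing import List

def hours_to_fill(currently_filled: datetime, fill_to_time: datetime) -> List[datetime]:
    # Mirrors the while loop in process_count_stat for an hourly stat:
    # starting one increment past the last filled bucket, emit every
    # hour boundary up to and including fill_to_time.
    increment = timedelta(hours=1)
    bucket = currently_filled + increment
    buckets = []
    while bucket <= fill_to_time:
        buckets.append(bucket)
        bucket += increment
    return buckets
```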
 def do_update_fill_state(fill_state: FillState, end_time: datetime, state: int) -> None:
     fill_state.end_time = end_time
     fill_state.state = state
     fill_state.save()
 # We assume end_time is valid (e.g. is on a day or hour boundary as appropriate)
 # and is timezone aware. It is the caller's responsibility to enforce this!
-def do_fill_count_stat_at_hour(
-    stat: CountStat, end_time: datetime, realm: Optional[Realm] = None
-) -> None:
+def do_fill_count_stat_at_hour(stat: CountStat, end_time: datetime) -> None:
     start_time = end_time - stat.interval
     if not isinstance(stat, LoggingCountStat):
         timer = time.time()
-        assert stat.data_collector.pull_function is not None
-        rows_added = stat.data_collector.pull_function(stat.property, start_time, end_time, realm)
-        logger.info(
-            "%s run pull_function (%dms/%sr)",
-            stat.property,
-            (time.time() - timer) * 1000,
-            rows_added,
-        )
-    do_aggregate_to_summary_table(stat, end_time, realm)
+        assert(stat.data_collector.pull_function is not None)
+        rows_added = stat.data_collector.pull_function(stat.property, start_time, end_time)
+        logger.info("%s run pull_function (%dms/%sr)" %
+                    (stat.property, (time.time()-timer)*1000, rows_added))
+    do_aggregate_to_summary_table(stat, end_time)
 def do_delete_counts_at_hour(stat: CountStat, end_time: datetime) -> None:
     if isinstance(stat, LoggingCountStat):
@@ -205,115 +147,66 @@ def do_delete_counts_at_hour(stat: CountStat, end_time: datetime) -> None:
     RealmCount.objects.filter(property=stat.property, end_time=end_time).delete()
     InstallationCount.objects.filter(property=stat.property, end_time=end_time).delete()
-def do_aggregate_to_summary_table(
-    stat: CountStat, end_time: datetime, realm: Optional[Realm] = None
-) -> None:
+def do_aggregate_to_summary_table(stat: CountStat, end_time: datetime) -> None:
     cursor = connection.cursor()

     # Aggregate into RealmCount
     output_table = stat.data_collector.output_table
-    if realm is not None:
-        realm_clause = SQL("AND zerver_realm.id = {}").format(Literal(realm.id))
-    else:
-        realm_clause = SQL("")
-
     if output_table in (UserCount, StreamCount):
-        realmcount_query = SQL(
-            """
+        realmcount_query = """
             INSERT INTO analytics_realmcount
                 (realm_id, value, property, subgroup, end_time)
             SELECT
-                zerver_realm.id, COALESCE(sum({output_table}.value), 0), %(property)s,
-                {output_table}.subgroup, %(end_time)s
+                zerver_realm.id, COALESCE(sum(%(output_table)s.value), 0), '%(property)s',
+                %(output_table)s.subgroup, %%(end_time)s
             FROM zerver_realm
-            JOIN {output_table}
+            JOIN %(output_table)s
             ON
-                zerver_realm.id = {output_table}.realm_id
+                zerver_realm.id = %(output_table)s.realm_id
             WHERE
-                {output_table}.property = %(property)s AND
-                {output_table}.end_time = %(end_time)s
-                {realm_clause}
-            GROUP BY zerver_realm.id, {output_table}.subgroup
-        """
-        ).format(
-            output_table=Identifier(output_table._meta.db_table),
-            realm_clause=realm_clause,
-        )
+                %(output_table)s.property = '%(property)s' AND
+                %(output_table)s.end_time = %%(end_time)s
+            GROUP BY zerver_realm.id, %(output_table)s.subgroup
+        """ % {'output_table': output_table._meta.db_table,
+               'property': stat.property}
         start = time.time()
-        cursor.execute(
-            realmcount_query,
-            {
-                "property": stat.property,
-                "end_time": end_time,
-            },
-        )
+        cursor.execute(realmcount_query, {'end_time': end_time})
         end = time.time()
-        logger.info(
-            "%s RealmCount aggregation (%dms/%sr)",
-            stat.property,
-            (end - start) * 1000,
-            cursor.rowcount,
-        )
+        logger.info("%s RealmCount aggregation (%dms/%sr)" % (
            stat.property, (end - start) * 1000, cursor.rowcount))

-    if realm is None:
-        # Aggregate into InstallationCount.  Only run if we just
-        # processed counts for all realms.
-        #
-        # TODO: Add support for updating installation data after
-        # changing an individual realm's values.
-        installationcount_query = SQL(
-            """
-        INSERT INTO analytics_installationcount
-            (value, property, subgroup, end_time)
-        SELECT
-            sum(value), %(property)s, analytics_realmcount.subgroup, %(end_time)s
-        FROM analytics_realmcount
-        WHERE
-            property = %(property)s AND
-            end_time = %(end_time)s
-        GROUP BY analytics_realmcount.subgroup
-        """
-        )
-
-        start = time.time()
-        cursor.execute(
-            installationcount_query,
-            {
-                "property": stat.property,
-                "end_time": end_time,
-            },
-        )
-        end = time.time()
-        logger.info(
-            "%s InstallationCount aggregation (%dms/%sr)",
-            stat.property,
-            (end - start) * 1000,
-            cursor.rowcount,
-        )
+    # Aggregate into InstallationCount
+    installationcount_query = """
+        INSERT INTO analytics_installationcount
+            (value, property, subgroup, end_time)
+        SELECT
+            sum(value), '%(property)s', analytics_realmcount.subgroup, %%(end_time)s
+        FROM analytics_realmcount
+        WHERE
+            property = '%(property)s' AND
+            end_time = %%(end_time)s
+        GROUP BY analytics_realmcount.subgroup
+    """ % {'property': stat.property}
+    start = time.time()
+    cursor.execute(installationcount_query, {'end_time': end_time})
+    end = time.time()
+    logger.info("%s InstallationCount aggregation (%dms/%sr)" % (
+        stat.property, (end - start) * 1000, cursor.rowcount))

     cursor.close()
 ## Utility functions called from outside counts.py ##

 # called from zerver/lib/actions.py; should not throw any errors
-def do_increment_logging_stat(
-    zerver_object: Union[Realm, UserProfile, Stream],
-    stat: CountStat,
-    subgroup: Optional[Union[str, int, bool]],
-    event_time: datetime,
-    increment: int = 1,
-) -> None:
-    if not increment:
-        return
-
+def do_increment_logging_stat(zerver_object: Union[Realm, UserProfile, Stream], stat: CountStat,
+                              subgroup: Optional[Union[str, int, bool]], event_time: datetime,
+                              increment: int=1) -> None:
     table = stat.data_collector.output_table
     if table == RealmCount:
-        id_args = {"realm": zerver_object}
+        id_args = {'realm': zerver_object}
     elif table == UserCount:
-        id_args = {"realm": zerver_object.realm, "user": zerver_object}
+        id_args = {'realm': zerver_object.realm, 'user': zerver_object}
     else:  # StreamCount
-        id_args = {"realm": zerver_object.realm, "stream": zerver_object}
+        id_args = {'realm': zerver_object.realm, 'stream': zerver_object}

     if stat.frequency == CountStat.DAY:
         end_time = ceiling_to_day(event_time)
@@ -321,16 +214,11 @@ def do_increment_logging_stat(
         end_time = ceiling_to_hour(event_time)

     row, created = table.objects.get_or_create(
-        property=stat.property,
-        subgroup=subgroup,
-        end_time=end_time,
-        defaults={"value": increment},
-        **id_args,
-    )
+        property=stat.property, subgroup=subgroup, end_time=end_time,
+        defaults={'value': increment}, **id_args)
     if not created:
-        row.value = F("value") + increment
-        row.save(update_fields=["value"])
+        row.value = F('value') + increment
+        row.save(update_fields=['value'])
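`do_increment_logging_stat` buckets each event by the *ceiling* of its timestamp and then upserts: create the row with the initial increment, or bump the existing value (via an `F('value')` expression in the real code, so concurrent bumps stay atomic). A minimal in-memory sketch of that bucketing and upsert, using a plain dict in place of the Django table:

```python
from datetime import datetime, timedelta
from typing import Dict, Tuple

def ceiling_to_hour(dt: datetime) -> datetime:
    # Round up to the next hour boundary; an event at 10:15 lands
    # in the 11:00 bucket, one exactly at 11:00 stays at 11:00.
    floored = dt.replace(minute=0, second=0, microsecond=0)
    return floored if floored == dt else floored + timedelta(hours=1)

counts: Dict[Tuple[str, str, datetime], int] = {}

def increment_logging_stat(property: str, subgroup: str,
                           event_time: datetime, increment: int = 1) -> None:
    key = (property, subgroup, ceiling_to_hour(event_time))
    counts[key] = counts.get(key, 0) + increment
```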
 def do_drop_all_analytics_tables() -> None:
     UserCount.objects.all().delete()
@@ -339,7 +227,6 @@ def do_drop_all_analytics_tables() -> None:
     InstallationCount.objects.all().delete()
     FillState.objects.all().delete()

 def do_drop_single_stat(property: str) -> None:
     UserCount.objects.filter(property=property).delete()
     StreamCount.objects.filter(property=property).delete()
@@ -347,142 +234,77 @@ def do_drop_single_stat(property: str) -> None:
     InstallationCount.objects.filter(property=property).delete()
     FillState.objects.filter(property=property).delete()
 ## DataCollector-level operations ##

-QueryFn = Callable[[Dict[str, Composable]], Composable]
-
-def do_pull_by_sql_query(
-    property: str,
-    start_time: datetime,
-    end_time: datetime,
-    query: QueryFn,
-    group_by: Optional[Tuple[models.Model, str]],
-) -> int:
+def do_pull_by_sql_query(property: str, start_time: datetime, end_time: datetime, query: str,
+                         group_by: Optional[Tuple[models.Model, str]]) -> int:
     if group_by is None:
-        subgroup = SQL("NULL")
-        group_by_clause = SQL("")
+        subgroup = 'NULL'
+        group_by_clause = ''
     else:
-        subgroup = Identifier(group_by[0]._meta.db_table, group_by[1])
-        group_by_clause = SQL(", {}").format(subgroup)
+        subgroup = '%s.%s' % (group_by[0]._meta.db_table, group_by[1])
+        group_by_clause = ', ' + subgroup

     # We do string replacement here because cursor.execute will reject a
     # group_by_clause given as a param.
     # We pass in the datetimes as params to cursor.execute so that we don't have to
     # think about how to convert python datetimes to SQL datetimes.
-    query_ = query(
-        {
-            "subgroup": subgroup,
-            "group_by_clause": group_by_clause,
-        }
-    )
+    query_ = query % {'property': property, 'subgroup': subgroup,
+                      'group_by_clause': group_by_clause}
     cursor = connection.cursor()
-    cursor.execute(
-        query_,
-        {
-            "property": property,
-            "time_start": start_time,
-            "time_end": end_time,
-        },
-    )
+    cursor.execute(query_, {'time_start': start_time, 'time_end': end_time})
     rowcount = cursor.rowcount
     cursor.close()
     return rowcount

-def sql_data_collector(
-    output_table: Type[BaseCount],
-    query: QueryFn,
-    group_by: Optional[Tuple[models.Model, str]],
-) -> DataCollector:
-    def pull_function(
-        property: str, start_time: datetime, end_time: datetime, realm: Optional[Realm] = None
-    ) -> int:
-        # The pull function type needs to accept a Realm argument
-        # because the 'minutes_active::day' CountStat uses
-        # DataCollector directly for do_pull_minutes_active, which
-        # requires the realm argument.  We ignore it here, because the
-        # realm should have been already encoded in the `query` we're
-        # passed.
+def sql_data_collector(output_table: Type[BaseCount], query: str,
+                       group_by: Optional[Tuple[models.Model, str]]) -> DataCollector:
+    def pull_function(property: str, start_time: datetime, end_time: datetime) -> int:
         return do_pull_by_sql_query(property, start_time, end_time, query, group_by)
     return DataCollector(output_table, pull_function)
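The 2.0.2 side builds each query in two passes: Python `%`-interpolation fills in structural pieces (table names, GROUP BY clauses) that database drivers won't accept as parameters, and `cursor.execute` then fills in the datetimes. That is why datetime placeholders in the query strings are written `%%(time_end)s` — the doubled `%` survives the first pass as a single `%`. A toy illustration of just the escaping, with a made-up template (the 4.10 side replaces this pattern with psycopg2's SQL composition objects):

```python
# Hypothetical template in the 2.0.2 style: single-% placeholders are
# filled by Python's % operator, doubled-%% placeholders are left for
# the database driver.
template = """
SELECT count(*) FROM %(table)s
WHERE created >= %%(time_start)s AND created < %%(time_end)s
GROUP BY %(table)s.id %(group_by_clause)s
"""

# First pass: splice in structural pieces as plain strings.
query = template % {"table": "analytics_usercount", "group_by_clause": ""}
# After this pass, %%(time_start)s has collapsed to %(time_start)s,
# ready for cursor.execute(query, {"time_start": ..., "time_end": ...}).
```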
-def do_pull_minutes_active(
-    property: str, start_time: datetime, end_time: datetime, realm: Optional[Realm] = None
-) -> int:
-    user_activity_intervals = (
-        UserActivityInterval.objects.filter(
-            end__gt=start_time,
-            start__lt=end_time,
-        )
-        .select_related(
-            "user_profile",
-        )
-        .values_list("user_profile_id", "user_profile__realm_id", "start", "end")
-    )
+def do_pull_minutes_active(property: str, start_time: datetime, end_time: datetime) -> int:
+    user_activity_intervals = UserActivityInterval.objects.filter(
+        end__gt=start_time, start__lt=end_time
+    ).select_related(
+        'user_profile'
+    ).values_list(
+        'user_profile_id', 'user_profile__realm_id', 'start', 'end')

-    seconds_active: Dict[Tuple[int, int], float] = defaultdict(float)
+    seconds_active = defaultdict(float)  # type: Dict[Tuple[int, int], float]
     for user_id, realm_id, interval_start, interval_end in user_activity_intervals:
-        if realm is None or realm.id == realm_id:
-            start = max(start_time, interval_start)
-            end = min(end_time, interval_end)
-            seconds_active[(user_id, realm_id)] += (end - start).total_seconds()
+        start = max(start_time, interval_start)
+        end = min(end_time, interval_end)
+        seconds_active[(user_id, realm_id)] += (end - start).total_seconds()

-    rows = [
-        UserCount(
-            user_id=ids[0],
-            realm_id=ids[1],
-            property=property,
-            end_time=end_time,
-            value=int(seconds // 60),
-        )
-        for ids, seconds in seconds_active.items()
-        if seconds >= 60
-    ]
+    rows = [UserCount(user_id=ids[0], realm_id=ids[1], property=property,
+                      end_time=end_time, value=int(seconds // 60))
+            for ids, seconds in seconds_active.items() if seconds >= 60]
     UserCount.objects.bulk_create(rows)
     return len(rows)
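The core of `do_pull_minutes_active` on both sides is pure interval arithmetic: clip each activity interval to the `[start_time, end_time)` window, sum the clipped seconds per user, and keep only users active for at least a minute. A self-contained sketch of that aggregation, with database types replaced by plain tuples:

```python
from collections import defaultdict
from datetime import datetime
from typing import Dict, List, Tuple

def minutes_active(intervals: List[Tuple[int, datetime, datetime]],
                   start_time: datetime, end_time: datetime) -> Dict[int, int]:
    # intervals: (user_id, interval_start, interval_end) tuples that
    # overlap the window, as the UserActivityInterval query guarantees.
    seconds: Dict[int, float] = defaultdict(float)
    for user_id, interval_start, interval_end in intervals:
        # Clip the interval to the window before summing.
        start = max(start_time, interval_start)
        end = min(end_time, interval_end)
        seconds[user_id] += (end - start).total_seconds()
    # Whole minutes only, and drop users under the 60-second floor.
    return {uid: int(s // 60) for uid, s in seconds.items() if s >= 60}
```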
-def count_message_by_user_query(realm: Optional[Realm]) -> QueryFn:
-    if realm is None:
-        realm_clause = SQL("")
-    else:
-        realm_clause = SQL("zerver_userprofile.realm_id = {} AND").format(Literal(realm.id))
-    return lambda kwargs: SQL(
-        """
+count_message_by_user_query = """
     INSERT INTO analytics_usercount
         (user_id, realm_id, value, property, subgroup, end_time)
     SELECT
         zerver_userprofile.id, zerver_userprofile.realm_id, count(*),
-        %(property)s, {subgroup}, %(time_end)s
+        '%(property)s', %(subgroup)s, %%(time_end)s
     FROM zerver_userprofile
     JOIN zerver_message
     ON
         zerver_userprofile.id = zerver_message.sender_id
     WHERE
-        zerver_userprofile.date_joined < %(time_end)s AND
-        zerver_message.date_sent >= %(time_start)s AND
-        {realm_clause}
-        zerver_message.date_sent < %(time_end)s
-    GROUP BY zerver_userprofile.id {group_by_clause}
+        zerver_userprofile.date_joined < %%(time_end)s AND
+        zerver_message.pub_date >= %%(time_start)s AND
+        zerver_message.pub_date < %%(time_end)s
+    GROUP BY zerver_userprofile.id %(group_by_clause)s
 """
-    ).format(**kwargs, realm_clause=realm_clause)
 # Note: ignores the group_by / group_by_clause.
-def count_message_type_by_user_query(realm: Optional[Realm]) -> QueryFn:
-    if realm is None:
-        realm_clause = SQL("")
-    else:
-        realm_clause = SQL("zerver_userprofile.realm_id = {} AND").format(Literal(realm.id))
-    return lambda kwargs: SQL(
-        """
+count_message_type_by_user_query = """
     INSERT INTO analytics_usercount
         (realm_id, user_id, value, property, subgroup, end_time)
-    SELECT realm_id, id, SUM(count) AS value, %(property)s, message_type, %(time_end)s
+    SELECT realm_id, id, SUM(count) AS value, '%(property)s', message_type, %%(time_end)s
     FROM
     (
         SELECT zerver_userprofile.realm_id, zerver_userprofile.id, count(*),
@@ -500,9 +322,8 @@ def count_message_type_by_user_query(realm: Optional[Realm]) -> QueryFn:
     JOIN zerver_message
     ON
         zerver_userprofile.id = zerver_message.sender_id AND
-        zerver_message.date_sent >= %(time_start)s AND
-        {realm_clause}
-        zerver_message.date_sent < %(time_end)s
+        zerver_message.pub_date >= %%(time_start)s AND
+        zerver_message.pub_date < %%(time_end)s
     JOIN zerver_recipient
     ON
         zerver_message.recipient_id = zerver_recipient.id
@@ -515,24 +336,16 @@ def count_message_type_by_user_query(realm: Optional[Realm]) -> QueryFn:
     ) AS subquery
     GROUP BY realm_id, id, message_type
 """
-    ).format(**kwargs, realm_clause=realm_clause)
 # This query joins to the UserProfile table since all current queries that
 # use this also subgroup on UserProfile.is_bot. If in the future there is a
 # stat that counts messages by stream and doesn't need the UserProfile
 # table, consider writing a new query for efficiency.
-def count_message_by_stream_query(realm: Optional[Realm]) -> QueryFn:
-    if realm is None:
-        realm_clause = SQL("")
-    else:
-        realm_clause = SQL("zerver_stream.realm_id = {} AND").format(Literal(realm.id))
-    return lambda kwargs: SQL(
-        """
+count_message_by_stream_query = """
     INSERT INTO analytics_streamcount
         (stream_id, realm_id, value, property, subgroup, end_time)
     SELECT
-        zerver_stream.id, zerver_stream.realm_id, count(*), %(property)s, {subgroup}, %(time_end)s
+        zerver_stream.id, zerver_stream.realm_id, count(*), '%(property)s', %(subgroup)s, %%(time_end)s
     FROM zerver_stream
     JOIN zerver_recipient
     ON
@@ -544,67 +357,48 @@ def count_message_by_stream_query(realm: Optional[Realm]) -> QueryFn:
     ON
         zerver_message.sender_id = zerver_userprofile.id
     WHERE
-        zerver_stream.date_created < %(time_end)s AND
+        zerver_stream.date_created < %%(time_end)s AND
         zerver_recipient.type = 2 AND
-        zerver_message.date_sent >= %(time_start)s AND
-        {realm_clause}
-        zerver_message.date_sent < %(time_end)s
-    GROUP BY zerver_stream.id {group_by_clause}
+        zerver_message.pub_date >= %%(time_start)s AND
+        zerver_message.pub_date < %%(time_end)s
+    GROUP BY zerver_stream.id %(group_by_clause)s
 """
-    ).format(**kwargs, realm_clause=realm_clause)
 # Hardcodes the query needed by active_users:is_bot:day, since that is
 # currently the only stat that uses this.
-def count_user_by_realm_query(realm: Optional[Realm]) -> QueryFn:
-    if realm is None:
-        realm_clause = SQL("")
-    else:
-        realm_clause = SQL("zerver_userprofile.realm_id = {} AND").format(Literal(realm.id))
-    return lambda kwargs: SQL(
-        """
+count_user_by_realm_query = """
     INSERT INTO analytics_realmcount
         (realm_id, value, property, subgroup, end_time)
     SELECT
-        zerver_realm.id, count(*), %(property)s, {subgroup}, %(time_end)s
+        zerver_realm.id, count(*),'%(property)s', %(subgroup)s, %%(time_end)s
     FROM zerver_realm
     JOIN zerver_userprofile
     ON
         zerver_realm.id = zerver_userprofile.realm_id
     WHERE
-        zerver_realm.date_created < %(time_end)s AND
-        zerver_userprofile.date_joined >= %(time_start)s AND
-        zerver_userprofile.date_joined < %(time_end)s AND
-        {realm_clause}
+        zerver_realm.date_created < %%(time_end)s AND
+        zerver_userprofile.date_joined >= %%(time_start)s AND
+        zerver_userprofile.date_joined < %%(time_end)s AND
         zerver_userprofile.is_active = TRUE
-    GROUP BY zerver_realm.id {group_by_clause}
+    GROUP BY zerver_realm.id %(group_by_clause)s
 """
-    ).format(**kwargs, realm_clause=realm_clause)
 # Currently hardcodes the query needed for active_users_audit:is_bot:day.
 # Assumes that a user cannot have two RealmAuditLog entries with the same event_time and
-# event_type in [RealmAuditLog.USER_CREATED, USER_DEACTIVATED, etc].
+# event_type in ['user_created', 'user_deactivated', etc].
 # In particular, it's important to ensure that migrations don't cause that to happen.
-def check_realmauditlog_by_user_query(realm: Optional[Realm]) -> QueryFn:
-    if realm is None:
-        realm_clause = SQL("")
-    else:
-        realm_clause = SQL("realm_id = {} AND").format(Literal(realm.id))
-    return lambda kwargs: SQL(
-        """
+check_realmauditlog_by_user_query = """
     INSERT INTO analytics_usercount
         (user_id, realm_id, value, property, subgroup, end_time)
     SELECT
-        ral1.modified_user_id, ral1.realm_id, 1, %(property)s, {subgroup}, %(time_end)s
+        ral1.modified_user_id, ral1.realm_id, 1, '%(property)s', %(subgroup)s, %%(time_end)s
     FROM zerver_realmauditlog ral1
     JOIN (
         SELECT modified_user_id, max(event_time) AS max_event_time
         FROM zerver_realmauditlog
         WHERE
-            event_type in ({user_created}, {user_activated}, {user_deactivated}, {user_reactivated}) AND
-            {realm_clause}
-            event_time < %(time_end)s
+            event_type in ('user_created', 'user_deactivated', 'user_activated', 'user_reactivated') AND
+            event_time < %%(time_end)s
         GROUP BY modified_user_id
     ) ral2
     ON
@@ -614,212 +408,131 @@ def check_realmauditlog_by_user_query(realm: Optional[Realm]) -> QueryFn:
     ON
         ral1.modified_user_id = zerver_userprofile.id
     WHERE
-        ral1.event_type in ({user_created}, {user_activated}, {user_reactivated})
+        ral1.event_type in ('user_created', 'user_activated', 'user_reactivated')
 """
-    ).format(
-        **kwargs,
-        user_created=Literal(RealmAuditLog.USER_CREATED),
-        user_activated=Literal(RealmAuditLog.USER_ACTIVATED),
-        user_deactivated=Literal(RealmAuditLog.USER_DEACTIVATED),
-        user_reactivated=Literal(RealmAuditLog.USER_REACTIVATED),
-        realm_clause=realm_clause,
-    )
-def check_useractivityinterval_by_user_query(realm: Optional[Realm]) -> QueryFn:
-    if realm is None:
-        realm_clause = SQL("")
-    else:
-        realm_clause = SQL("zerver_userprofile.realm_id = {} AND").format(Literal(realm.id))
-    return lambda kwargs: SQL(
-        """
+check_useractivityinterval_by_user_query = """
     INSERT INTO analytics_usercount
         (user_id, realm_id, value, property, subgroup, end_time)
     SELECT
-        zerver_userprofile.id, zerver_userprofile.realm_id, 1, %(property)s, {subgroup}, %(time_end)s
+        zerver_userprofile.id, zerver_userprofile.realm_id, 1, '%(property)s', %(subgroup)s, %%(time_end)s
     FROM zerver_userprofile
     JOIN zerver_useractivityinterval
     ON
         zerver_userprofile.id = zerver_useractivityinterval.user_profile_id
     WHERE
-        zerver_useractivityinterval.end >= %(time_start)s AND
-        {realm_clause}
-        zerver_useractivityinterval.start < %(time_end)s
-    GROUP BY zerver_userprofile.id {group_by_clause}
+        zerver_useractivityinterval.end >= %%(time_start)s AND
+        zerver_useractivityinterval.start < %%(time_end)s
+    GROUP BY zerver_userprofile.id %(group_by_clause)s
 """
-    ).format(**kwargs, realm_clause=realm_clause)
-def count_realm_active_humans_query(realm: Optional[Realm]) -> QueryFn:
-    if realm is None:
-        realm_clause = SQL("")
-    else:
-        realm_clause = SQL("realm_id = {} AND").format(Literal(realm.id))
-    return lambda kwargs: SQL(
-        """
+count_realm_active_humans_query = """
     INSERT INTO analytics_realmcount
         (realm_id, value, property, subgroup, end_time)
     SELECT
-        usercount1.realm_id, count(*), %(property)s, NULL, %(time_end)s
+        usercount1.realm_id, count(*), '%(property)s', NULL, %%(time_end)s
     FROM (
         SELECT realm_id, user_id
         FROM analytics_usercount
         WHERE
            property = 'active_users_audit:is_bot:day' AND
            subgroup = 'false' AND
-            {realm_clause}
-            end_time = %(time_end)s
+            end_time = %%(time_end)s
     ) usercount1
     JOIN (
         SELECT realm_id, user_id
         FROM analytics_usercount
         WHERE
            property = '15day_actives::day' AND
-            {realm_clause}
-            end_time = %(time_end)s
+            end_time = %%(time_end)s
     ) usercount2
     ON
         usercount1.user_id = usercount2.user_id
     GROUP BY usercount1.realm_id
 """
-    ).format(**kwargs, realm_clause=realm_clause)
 # Currently unused and untested
-count_stream_by_realm_query = lambda kwargs: SQL(
-    """
+count_stream_by_realm_query = """
     INSERT INTO analytics_realmcount
         (realm_id, value, property, subgroup, end_time)
     SELECT
-        zerver_realm.id, count(*), %(property)s, {subgroup}, %(time_end)s
+        zerver_realm.id, count(*), '%(property)s', %(subgroup)s, %%(time_end)s
     FROM zerver_realm
     JOIN zerver_stream
     ON
         zerver_realm.id = zerver_stream.realm_id AND
     WHERE
-        zerver_realm.date_created < %(time_end)s AND
-        zerver_stream.date_created >= %(time_start)s AND
-        zerver_stream.date_created < %(time_end)s
-    GROUP BY zerver_realm.id {group_by_clause}
+        zerver_realm.date_created < %%(time_end)s AND
+        zerver_stream.date_created >= %%(time_start)s AND
+        zerver_stream.date_created < %%(time_end)s
+    GROUP BY zerver_realm.id %(group_by_clause)s
 """
-).format(**kwargs)
## CountStat declarations ##

def get_count_stats(realm: Optional[Realm] = None) -> Dict[str, CountStat]:
    count_stats_ = [
        # Messages sent stats
        # Stats that count the number of messages sent in various ways.
        # These are also the set of stats that read from the Message table.
        CountStat(
            "messages_sent:is_bot:hour",
            sql_data_collector(
                UserCount, count_message_by_user_query(realm), (UserProfile, "is_bot")
            ),
            CountStat.HOUR,
        ),
        CountStat(
            "messages_sent:message_type:day",
            sql_data_collector(UserCount, count_message_type_by_user_query(realm), None),
            CountStat.DAY,
        ),
        CountStat(
            "messages_sent:client:day",
            sql_data_collector(
                UserCount, count_message_by_user_query(realm), (Message, "sending_client_id")
            ),
            CountStat.DAY,
        ),
        CountStat(
            "messages_in_stream:is_bot:day",
            sql_data_collector(
                StreamCount, count_message_by_stream_query(realm), (UserProfile, "is_bot")
            ),
            CountStat.DAY,
        ),
        # Number of users stats
        # Stats that count the number of active users in the UserProfile.is_active sense.
        # 'active_users_audit:is_bot:day' is the canonical record of which users were
        # active on which days (in the UserProfile.is_active sense).
        # Important that this stay a daily stat, so that 'realm_active_humans::day' works as expected.
        CountStat(
            "active_users_audit:is_bot:day",
            sql_data_collector(
                UserCount, check_realmauditlog_by_user_query(realm), (UserProfile, "is_bot")
            ),
            CountStat.DAY,
        ),
        # Important note: LoggingCountStat objects aren't passed the
        # Realm argument, because by nature they have a logging
        # structure, not a pull-from-database structure, so there's no
        # way to compute them for a single realm after the fact (the
        # use case for passing a Realm argument).
        #
        # Sanity check on 'active_users_audit:is_bot:day', and an archetype for future LoggingCountStats.
        # In RealmCount, 'active_users_audit:is_bot:day' should be the partial
        # sum sequence of 'active_users_log:is_bot:day', for any realm that
        # started after the latter stat was introduced.
        LoggingCountStat("active_users_log:is_bot:day", RealmCount, CountStat.DAY),
        # Another sanity check on 'active_users_audit:is_bot:day'. Is only an
        # approximation, e.g. if a user is deactivated between the end of the
        # day and when this stat is run, they won't be counted. However, is the
        # simplest of the three to inspect by hand.
        CountStat(
            "active_users:is_bot:day",
            sql_data_collector(
                RealmCount, count_user_by_realm_query(realm), (UserProfile, "is_bot")
            ),
            CountStat.DAY,
            interval=TIMEDELTA_MAX,
        ),
        # Messages read stats. messages_read::hour is the total
        # number of messages read, whereas
        # messages_read_interactions::hour tries to count the total
        # number of UI interactions resulting in messages being marked
        # as read (imperfect because of batching of some request
        # types, but less likely to be overwhelmed by a single bulk
        # operation).
        LoggingCountStat("messages_read::hour", UserCount, CountStat.HOUR),
        LoggingCountStat("messages_read_interactions::hour", UserCount, CountStat.HOUR),
        # User activity stats
        # Stats that measure user activity in the UserActivityInterval sense.
        CountStat(
            "1day_actives::day",
            sql_data_collector(UserCount, check_useractivityinterval_by_user_query(realm), None),
            CountStat.DAY,
            interval=timedelta(days=1) - UserActivityInterval.MIN_INTERVAL_LENGTH,
        ),
        CountStat(
            "7day_actives::day",
            sql_data_collector(UserCount, check_useractivityinterval_by_user_query(realm), None),
            CountStat.DAY,
            interval=timedelta(days=7) - UserActivityInterval.MIN_INTERVAL_LENGTH,
        ),
        CountStat(
            "15day_actives::day",
            sql_data_collector(UserCount, check_useractivityinterval_by_user_query(realm), None),
            CountStat.DAY,
            interval=timedelta(days=15) - UserActivityInterval.MIN_INTERVAL_LENGTH,
        ),
        CountStat(
            "minutes_active::day", DataCollector(UserCount, do_pull_minutes_active), CountStat.DAY
        ),
        # Rate limiting stats
        # Used to limit the number of invitation emails sent by a realm
        LoggingCountStat("invites_sent::day", RealmCount, CountStat.DAY),
        # Dependent stats
        # Must come after their dependencies.
        # Canonical account of the number of active humans in a realm on each day.
        DependentCountStat(
            "realm_active_humans::day",
            sql_data_collector(RealmCount, count_realm_active_humans_query(realm), None),
            CountStat.DAY,
            dependencies=["active_users_audit:is_bot:day", "15day_actives::day"],
        ),
    ]

    return OrderedDict((stat.property, stat) for stat in count_stats_)

# To avoid refactoring for now COUNT_STATS can be used as before
COUNT_STATS = get_count_stats()
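The registry pattern used here — declare stat objects in dependency order, then expose an ordered property → stat mapping — can be sketched in miniature (`MiniStat` is a hypothetical stand-in for `CountStat`, not Zulip code):

```python
from collections import OrderedDict

class MiniStat:
    # Hypothetical stand-in for CountStat: just a property key and a frequency.
    def __init__(self, property: str, frequency: str) -> None:
        self.property = property
        self.frequency = frequency

def build_registry(stats):
    # Keyed by property, preserving declaration order, as get_count_stats does.
    return OrderedDict((stat.property, stat) for stat in stats)

registry = build_registry([
    MiniStat("messages_sent:is_bot:hour", "hour"),
    MiniStat("invites_sent::day", "day"),
])
```

Preserving declaration order matters because dependent stats must be filled after their dependencies.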


@@ -4,19 +4,11 @@ from typing import List
from analytics.lib.counts import CountStat

def generate_time_series_data(
    days: int = 100,
    business_hours_base: float = 10,
    non_business_hours_base: float = 10,
    growth: float = 1,
    autocorrelation: float = 0,
    spikiness: float = 1,
    holiday_rate: float = 0,
    frequency: str = CountStat.DAY,
    partial_sum: bool = False,
    random_seed: int = 26,
) -> List[int]:
""" """
Generate semi-realistic looking time series data for testing analytics graphs. Generate semi-realistic looking time series data for testing analytics graphs.
@@ -37,43 +29,35 @@ def generate_time_series_data(
    random_seed -- Seed for random number generator.
    """
    if frequency == CountStat.HOUR:
        length = days * 24
        seasonality = [non_business_hours_base] * 24 * 7
        for day in range(5):
            for hour in range(8):
                seasonality[24 * day + hour] = business_hours_base
        holidays = []
        for i in range(days):
            holidays.extend([random() < holiday_rate] * 24)
    elif frequency == CountStat.DAY:
        length = days
        seasonality = [8 * business_hours_base + 16 * non_business_hours_base] * 5 + [
            24 * non_business_hours_base
        ] * 2
        holidays = [random() < holiday_rate for i in range(days)]
    else:
        raise AssertionError(f"Unknown frequency: {frequency}")
    if length < 2:
        raise AssertionError(
            f"Must be generating at least 2 data points. Currently generating {length}"
        )
    growth_base = growth ** (1.0 / (length - 1))
    values_no_noise = [
        seasonality[i % len(seasonality)] * (growth_base ** i) for i in range(length)
    ]

    seed(random_seed)
    noise_scalars = [gauss(0, 1)]
    for i in range(1, length):
        noise_scalars.append(
            noise_scalars[-1] * autocorrelation + gauss(0, 1) * (1 - autocorrelation)
        )

    values = [
        0 if holiday else int(v + sqrt(v) * noise_scalar * spikiness)
        for v, noise_scalar, holiday in zip(values_no_noise, noise_scalars, holidays)
    ]
    if partial_sum:
        for i in range(1, length):
            values[i] = values[i - 1] + values[i]
    return [max(v, 0) for v in values]
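The noise model above is an AR(1)-style blend: each noise scalar mixes the previous one with fresh Gaussian noise, weighted by `autocorrelation`. A self-contained sketch of just that piece (the function name is illustrative):

```python
from random import gauss, seed

def autocorrelated_noise(length: int, autocorrelation: float, random_seed: int = 26):
    # Mirrors the loop in generate_time_series_data: at autocorrelation=0 the
    # terms are independent draws; at autocorrelation=1 the first draw repeats.
    seed(random_seed)
    scalars = [gauss(0, 1)]
    for _ in range(1, length):
        scalars.append(scalars[-1] * autocorrelation + gauss(0, 1) * (1 - autocorrelation))
    return scalars
```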


@@ -4,14 +4,12 @@ from typing import List, Optional
from analytics.lib.counts import CountStat
from zerver.lib.timestamp import floor_to_day, floor_to_hour, verify_UTC

# If min_length is None, returns end_times from ceiling(start) to floor(end), inclusive.
# If min_length is greater than 0, pads the list to the left.
# So informally, time_range(Sep 20, Sep 22, day, None) returns [Sep 20, Sep 21, Sep 22],
# and time_range(Sep 20, Sep 22, day, 5) returns [Sep 18, Sep 19, Sep 20, Sep 21, Sep 22]
def time_range(
    start: datetime, end: datetime, frequency: str, min_length: Optional[int]
) -> List[datetime]:
    verify_UTC(start)
    verify_UTC(end)
    if frequency == CountStat.HOUR:
@@ -21,11 +19,11 @@ def time_range(
        end = floor_to_day(end)
        step = timedelta(days=1)
    else:
        raise AssertionError(f"Unknown frequency: {frequency}")

    times = []
    if min_length is not None:
        start = min(start, end - (min_length - 1) * step)
    current = end
    while current >= start:
        times.append(current)
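The padding behavior the comment describes can be exercised with a small standalone sketch of the same loop (with `step` passed directly instead of derived from frequency, and the ascending return order the comment implies):

```python
from datetime import datetime, timedelta
from typing import List, Optional

def time_range_sketch(start: datetime, end: datetime, step: timedelta,
                      min_length: Optional[int]) -> List[datetime]:
    if min_length is not None:
        # Pad to the left so at least min_length end_times are returned.
        start = min(start, end - (min_length - 1) * step)
    times = []
    current = end
    while current >= start:
        times.append(current)
        current -= step
    return list(reversed(times))
```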


@@ -0,0 +1,81 @@
import datetime
import logging
import time
from typing import Any, Dict
from django.core.management.base import BaseCommand, CommandParser
from zerver.lib.timestamp import timestamp_to_datetime
from zerver.models import Message, Recipient
def compute_stats(log_level: int) -> None:
logger = logging.getLogger()
logger.setLevel(log_level)
one_week_ago = timestamp_to_datetime(time.time()) - datetime.timedelta(weeks=1)
mit_query = Message.objects.filter(sender__realm__string_id="zephyr",
recipient__type=Recipient.STREAM,
pub_date__gt=one_week_ago)
for bot_sender_start in ["imap.", "rcmd.", "sys."]:
mit_query = mit_query.exclude(sender__email__startswith=(bot_sender_start))
# Filtering for "/" covers tabbott/extra@ and all the daemon/foo bots.
mit_query = mit_query.exclude(sender__email__contains=("/"))
mit_query = mit_query.exclude(sender__email__contains=("aim.com"))
mit_query = mit_query.exclude(
sender__email__in=["rss@mit.edu", "bash@mit.edu", "apache@mit.edu",
"bitcoin@mit.edu", "lp@mit.edu", "clocks@mit.edu",
"root@mit.edu", "nagios@mit.edu",
"www-data|local-realm@mit.edu"])
user_counts = {} # type: Dict[str, Dict[str, int]]
for m in mit_query.select_related("sending_client", "sender"):
email = m.sender.email
user_counts.setdefault(email, {})
user_counts[email].setdefault(m.sending_client.name, 0)
user_counts[email][m.sending_client.name] += 1
total_counts = {} # type: Dict[str, int]
total_user_counts = {} # type: Dict[str, int]
for email, counts in user_counts.items():
total_user_counts.setdefault(email, 0)
for client_name, count in counts.items():
total_counts.setdefault(client_name, 0)
total_counts[client_name] += count
total_user_counts[email] += count
logging.debug("%40s | %10s | %s" % ("User", "Messages", "Percentage Zulip"))
top_percents = {} # type: Dict[int, float]
for size in [10, 25, 50, 100, 200, len(total_user_counts.keys())]:
top_percents[size] = 0.0
for i, email in enumerate(sorted(total_user_counts.keys(),
key=lambda x: -total_user_counts[x])):
percent_zulip = round(100 - (user_counts[email].get("zephyr_mirror", 0)) * 100. /
total_user_counts[email], 1)
for size in top_percents.keys():
top_percents.setdefault(size, 0)
if i < size:
top_percents[size] += (percent_zulip * 1.0 / size)
logging.debug("%40s | %10s | %s%%" % (email, total_user_counts[email],
percent_zulip))
logging.info("")
for size in sorted(top_percents.keys()):
logging.info("Top %6s | %s%%" % (size, round(top_percents[size], 1)))
grand_total = sum(total_counts.values())
print(grand_total)
logging.info("%15s | %s" % ("Client", "Percentage"))
for client in total_counts.keys():
logging.info("%15s | %s%%" % (client, round(100. * total_counts[client] / grand_total, 1)))
class Command(BaseCommand):
help = "Compute statistics on MIT Zephyr usage."
def add_arguments(self, parser: CommandParser) -> None:
parser.add_argument('--verbose', default=False, action='store_true')
def handle(self, *args: Any, **options: Any) -> None:
level = logging.INFO
if options["verbose"]:
level = logging.DEBUG
compute_stats(level)
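The two-level tally in compute_stats — messages grouped by sender email, then by sending client, rolled up into per-user totals — can be sketched on its own (the function and row shape are illustrative):

```python
from collections import defaultdict
from typing import Dict, Iterable, Tuple

def tally(messages: Iterable[Tuple[str, str]]):
    # messages: (sender_email, client_name) pairs, mirroring the fields
    # compute_stats reads off each Message row.
    user_counts: Dict[str, Dict[str, int]] = defaultdict(lambda: defaultdict(int))
    for email, client in messages:
        user_counts[email][client] += 1
    # Per-user totals across all clients, as total_user_counts above.
    total_user_counts = {email: sum(c.values()) for email, c in user_counts.items()}
    return user_counts, total_user_counts
```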


@@ -0,0 +1,56 @@
import datetime
from typing import Any, Dict
from django.core.management.base import BaseCommand, CommandParser
from django.utils.timezone import utc
from zerver.lib.statistics import seconds_usage_between
from zerver.models import UserProfile
def analyze_activity(options: Dict[str, Any]) -> None:
day_start = datetime.datetime.strptime(options["date"], "%Y-%m-%d").replace(tzinfo=utc)
day_end = day_start + datetime.timedelta(days=options["duration"])
user_profile_query = UserProfile.objects.all()
if options["realm"]:
user_profile_query = user_profile_query.filter(realm__string_id=options["realm"])
print("Per-user online duration:\n")
total_duration = datetime.timedelta(0)
for user_profile in user_profile_query:
duration = seconds_usage_between(user_profile, day_start, day_end)
if duration == datetime.timedelta(0):
continue
total_duration += duration
print("%-*s%s" % (37, user_profile.email, duration,))
print("\nTotal Duration: %s" % (total_duration,))
print("\nTotal Duration in minutes: %s" % (total_duration.total_seconds() / 60.,))
print("Total Duration amortized to a month: %s" % (total_duration.total_seconds() * 30. / 60.,))
class Command(BaseCommand):
help = """Report analytics of user activity on a per-user and realm basis.
This command aggregates user activity data that is collected by each user using Zulip. It attempts
to approximate how much each user has been using Zulip per day, measured by recording each 15 minute
period where some activity has occurred (mouse move or keyboard activity).
It will correctly not count server-initiated reloads in the activity statistics.
The duration flag can be used to control how many days to show usage duration for
Usage: ./manage.py analyze_user_activity [--realm=zulip] [--date=2013-09-10] [--duration=1]
By default, if no date is selected, 2013-09-06 is used. If no realm is provided, information
is shown for all realms."""
def add_arguments(self, parser: CommandParser) -> None:
parser.add_argument('--realm', action='store')
parser.add_argument('--date', action='store', default="2013-09-06")
parser.add_argument('--duration', action='store', default=1, type=int,
help="How many days to show usage information for")
def handle(self, *args: Any, **options: Any) -> None:
analyze_activity(options)


@@ -1,24 +1,26 @@
import os
import time
from datetime import timedelta
from typing import Any, Dict

from django.core.management.base import BaseCommand
from django.utils.timezone import now as timezone_now

from analytics.lib.counts import COUNT_STATS, CountStat
from analytics.models import installation_epoch
from zerver.lib.timestamp import TimezoneNotUTCException, floor_to_day, floor_to_hour, verify_UTC
from zerver.models import Realm

states = {
    0: "OK",
    1: "WARNING",
    2: "CRITICAL",
    3: "UNKNOWN",
}
class Command(BaseCommand):
    help = """Checks FillState table.
@@ -26,30 +28,31 @@ class Command(BaseCommand):
    def handle(self, *args: Any, **options: Any) -> None:
        fill_state = self.get_fill_state()
        status = fill_state["status"]
        message = fill_state["message"]

        state_file_path = "/var/lib/nagios_state/check-analytics-state"
        state_file_tmp = state_file_path + "-tmp"

        with open(state_file_tmp, "w") as f:
            f.write(f"{int(time.time())}|{status}|{states[status]}|{message}\n")
        os.rename(state_file_tmp, state_file_path)

    def get_fill_state(self) -> Dict[str, Any]:
        if not Realm.objects.exists():
            return {"status": 0, "message": "No realms exist, so not checking FillState."}

        warning_unfilled_properties = []
        critical_unfilled_properties = []
        for property, stat in COUNT_STATS.items():
            last_fill = stat.last_successful_fill()
            if last_fill is None:
                last_fill = installation_epoch()
            try:
                verify_UTC(last_fill)
            except TimezoneNotUTCException:
                return {"status": 2, "message": f"FillState not in UTC for {property}"}

            if stat.frequency == CountStat.DAY:
                floor_function = floor_to_day
@@ -61,10 +64,8 @@ class Command(BaseCommand):
            critical_threshold = timedelta(minutes=150)

            if floor_function(last_fill) != last_fill:
                return {
                    "status": 2,
                    "message": f"FillState not on {stat.frequency} boundary for {property}",
                }

            time_to_last_fill = timezone_now() - last_fill
            if time_to_last_fill > critical_threshold:
@@ -73,18 +74,9 @@ class Command(BaseCommand):
                warning_unfilled_properties.append(property)

        if len(critical_unfilled_properties) == 0 and len(warning_unfilled_properties) == 0:
            return {"status": 0, "message": "FillState looks fine."}
        if len(critical_unfilled_properties) == 0:
            return {
                "status": 1,
                "message": "Missed filling {} once.".format(
                    ", ".join(warning_unfilled_properties),
                ),
            }
        return {
            "status": 2,
            "message": "Missed filling {} once. Missed filling {} at least twice.".format(
                ", ".join(warning_unfilled_properties),
                ", ".join(critical_unfilled_properties),
            ),
        }
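The write-to-temp-then-rename pattern in handle() is what keeps the Nagios state file consistent: `os.rename()` atomically replaces the destination on POSIX when source and destination share a filesystem, so a reader never sees a half-written line. A standalone sketch of that pattern (STATES mirrors the states map above):

```python
import os
import time

STATES = {0: "OK", 1: "WARNING", 2: "CRITICAL", 3: "UNKNOWN"}

def write_state_atomically(path: str, status: int, message: str) -> None:
    tmp = path + "-tmp"
    with open(tmp, "w") as f:
        f.write(f"{int(time.time())}|{status}|{STATES[status]}|{message}\n")
    # Atomic replacement: a concurrent reader sees the old file or the new one,
    # never a partially written file.
    os.rename(tmp, path)
```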


@@ -1,21 +1,22 @@
from argparse import ArgumentParser
from typing import Any

from django.core.management.base import BaseCommand, CommandError

from analytics.lib.counts import do_drop_all_analytics_tables

class Command(BaseCommand):
    help = """Clear analytics tables."""

    def add_arguments(self, parser: ArgumentParser) -> None:
        parser.add_argument("--force", action="store_true", help="Clear analytics tables.")

    def handle(self, *args: Any, **options: Any) -> None:
        if options["force"]:
            do_drop_all_analytics_tables()
        else:
            raise CommandError(
                "Would delete all data from analytics tables (!); use --force to do so."
            )


@@ -1,23 +1,29 @@
from argparse import ArgumentParser
from typing import Any

from django.core.management.base import BaseCommand, CommandError

from analytics.lib.counts import COUNT_STATS, do_drop_single_stat

class Command(BaseCommand):
    help = """Clear analytics tables."""

    def add_arguments(self, parser: ArgumentParser) -> None:
        parser.add_argument("--force", action="store_true", help="Actually do it.")
        parser.add_argument("--property", help="The property of the stat to be cleared.")

    def handle(self, *args: Any, **options: Any) -> None:
        property = options["property"]
        if property not in COUNT_STATS:
            raise CommandError(f"Invalid property: {property}")
        if not options["force"]:
            raise CommandError("No action taken. Use --force.")

        do_drop_single_stat(property)


@@ -0,0 +1,73 @@
import datetime
from argparse import ArgumentParser
from typing import Any
from django.db.models import Count, QuerySet
from django.utils.timezone import now as timezone_now
from zerver.lib.management import ZulipBaseCommand
from zerver.models import UserActivity
class Command(ZulipBaseCommand):
help = """Report rough client activity globally, for a realm, or for a user
Usage examples:
./manage.py client_activity --target server
./manage.py client_activity --target realm --realm zulip
./manage.py client_activity --target user --user hamlet@zulip.com --realm zulip"""
def add_arguments(self, parser: ArgumentParser) -> None:
parser.add_argument('--target', dest='target', required=True, type=str,
help="'server' will calculate client activity of the entire server. "
"'realm' will calculate client activity of realm. "
"'user' will calculate client activity of the user.")
parser.add_argument('--user', dest='user', type=str,
help="The email address of the user you want to calculate activity.")
self.add_realm_args(parser)
def compute_activity(self, user_activity_objects: QuerySet) -> None:
# Report data from the past week.
#
# This is a rough report of client activity because we inconsistently
# register activity from various clients; think of it as telling you
# approximately how many people from a group have used a particular
# client recently. For example, this might be useful to get a sense of
# how popular different versions of a desktop client are.
#
# Importantly, this does NOT tell you anything about the relative
# volumes of requests from clients.
threshold = timezone_now() - datetime.timedelta(days=7)
client_counts = user_activity_objects.filter(
last_visit__gt=threshold).values("client__name").annotate(
count=Count('client__name'))
total = 0
counts = []
for client_type in client_counts:
count = client_type["count"]
client = client_type["client__name"]
total += count
counts.append((count, client))
counts.sort()
for count in counts:
print("%25s %15d" % (count[1], count[0]))
print("Total:", total)
def handle(self, *args: Any, **options: str) -> None:
realm = self.get_realm(options)
if options["user"] is None:
if options["target"] == "server" and realm is None:
# Report global activity.
self.compute_activity(UserActivity.objects.all())
elif options["target"] == "realm" and realm is not None:
self.compute_activity(UserActivity.objects.filter(user_profile__realm=realm))
else:
self.print_help("./manage.py", "client_activity")
elif options["target"] == "user":
user_profile = self.get_user(options["user"], realm)
self.compute_activity(UserActivity.objects.filter(user_profile=user_profile))
else:
self.print_help("./manage.py", "client_activity")


@@ -1,26 +1,20 @@
from datetime import timedelta
from typing import Any, Dict, List, Mapping, Optional, Type
from unittest import mock

from django.core.management.base import BaseCommand
from django.utils.timezone import now as timezone_now

from analytics.lib.counts import COUNT_STATS, CountStat, do_drop_all_analytics_tables
from analytics.lib.fixtures import generate_time_series_data
from analytics.lib.time_utils import time_range
from analytics.models import (
    BaseCount,
    FillState,
    InstallationCount,
    RealmCount,
    StreamCount,
    UserCount,
)
from zerver.lib.actions import STREAM_ASSIGNMENT_COLORS, do_change_user_role, do_create_realm
from zerver.lib.create_user import create_user
from zerver.lib.timestamp import floor_to_day
from zerver.models import Client, Realm, Recipient, Stream, Subscription, UserProfile
class Command(BaseCommand):
    help = """Populates analytics tables with randomly generated data."""
@@ -28,30 +22,30 @@ class Command(BaseCommand):
    DAYS_OF_DATA = 100
    random_seed = 26
    def generate_fixture_data(
        self,
        stat: CountStat,
        business_hours_base: float,
        non_business_hours_base: float,
        growth: float,
        autocorrelation: float,
        spikiness: float,
        holiday_rate: float = 0,
        partial_sum: bool = False,
    ) -> List[int]:
        self.random_seed += 1
        return generate_time_series_data(
            days=self.DAYS_OF_DATA,
            business_hours_base=business_hours_base,
            non_business_hours_base=non_business_hours_base,
            growth=growth,
            autocorrelation=autocorrelation,
            spikiness=spikiness,
            holiday_rate=holiday_rate,
            frequency=stat.frequency,
            partial_sum=partial_sum,
            random_seed=self.random_seed,
        )
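The helper above delegates to `generate_time_series_data`, which lives elsewhere in the analytics app. As a rough illustration of the business-hours/non-business-hours split it drives, here is a minimal standalone sketch (hypothetical function and distribution, not the real implementation):

```python
import random
from typing import List

def sketch_time_series(days: int, business_hours_base: float,
                       non_business_hours_base: float, random_seed: int) -> List[int]:
    # Hypothetical sketch: 24 hourly counts per day, with a higher base
    # rate during business hours (9-17) than during off hours.
    rng = random.Random(random_seed)
    values = []
    for _ in range(days):
        for hour in range(24):
            base = business_hours_base if 9 <= hour < 17 else non_business_hours_base
            # Jitter the base rate and clamp to a non-negative integer count.
            values.append(max(0, round(rng.gauss(base, base / 2 or 1))))
    return values

series = sketch_time_series(days=2, business_hours_base=20,
                            non_business_hours_base=2, random_seed=26)
```

The real generator also layers in growth, autocorrelation, spikiness, and holiday effects, which this sketch omits.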
    def handle(self, *args: Any, **options: Any) -> None:
        # TODO: This should arguably only delete the objects
@@ -59,7 +53,7 @@ class Command(BaseCommand):
        do_drop_all_analytics_tables()

        # This also deletes any objects with this realm as a foreign key
        Realm.objects.filter(string_id="analytics").delete()

        # Because we just deleted a bunch of objects in the database
        # directly (rather than deleting individual objects in Django,
@@ -68,233 +62,163 @@ class Command(BaseCommand):
        # memcached in order to ensure deleted objects aren't still
        # present in the memcached cache.
        from zerver.apps import flush_cache

        flush_cache(None)

        installation_time = timezone_now() - timedelta(days=self.DAYS_OF_DATA)
        last_end_time = floor_to_day(timezone_now())
        realm = do_create_realm(
            string_id="analytics", name="Analytics", date_created=installation_time
        )

        with mock.patch("zerver.lib.create_user.timezone_now", return_value=installation_time):
            shylock = create_user(
                "shylock@analytics.ds",
                "Shylock",
                realm,
                full_name="Shylock",
                role=UserProfile.ROLE_REALM_OWNER,
            )
        do_change_user_role(shylock, UserProfile.ROLE_REALM_OWNER, acting_user=None)
        stream = Stream.objects.create(name="all", realm=realm, date_created=installation_time)
        recipient = Recipient.objects.create(type_id=stream.id, type=Recipient.STREAM)
        stream.recipient = recipient
        stream.save(update_fields=["recipient"])

        # Subscribe shylock to the stream to avoid invariant failures.
        # TODO: This should use subscribe_users_to_streams from populate_db.
        subs = [
            Subscription(
                recipient=recipient,
                user_profile=shylock,
                is_user_active=shylock.is_active,
                color=STREAM_ASSIGNMENT_COLORS[0],
            ),
        ]
        Subscription.objects.bulk_create(subs)
        def insert_fixture_data(
            stat: CountStat, fixture_data: Mapping[Optional[str], List[int]], table: Type[BaseCount]
        ) -> None:
            end_times = time_range(
                last_end_time, last_end_time, stat.frequency, len(list(fixture_data.values())[0])
            )
            if table == InstallationCount:
                id_args: Dict[str, Any] = {}
            if table == RealmCount:
                id_args = {"realm": realm}
            if table == UserCount:
                id_args = {"realm": realm, "user": shylock}
            if table == StreamCount:
                id_args = {"stream": stream, "realm": realm}

            for subgroup, values in fixture_data.items():
                table.objects.bulk_create(
                    table(
                        property=stat.property,
                        subgroup=subgroup,
                        end_time=end_time,
                        value=value,
                        **id_args,
                    )
                    for end_time, value in zip(end_times, values)
                    if value != 0
                )
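The `bulk_create` generator above pairs each end time with its value and drops zero rows. The same zip-and-filter shape, with plain dicts standing in for the Django model rows:

```python
from datetime import datetime, timedelta

# Hypothetical stand-ins for the real end_times / fixture values.
end_times = [datetime(2019, 3, 1) + timedelta(days=i) for i in range(4)]
values = [5, 0, 3, 0]

# Mirror of the bulk_create generator: one row per nonzero value.
rows = [
    {"end_time": end_time, "value": value}
    for end_time, value in zip(end_times, values)
    if value != 0
]
```

Skipping zeros keeps the analytics tables sparse; absent rows are treated as zero when the graphs are rendered.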
        stat = COUNT_STATS["1day_actives::day"]
        realm_data: Mapping[Optional[str], List[int]] = {
            None: self.generate_fixture_data(stat, 0.08, 0.02, 3, 0.3, 6, partial_sum=True),
        }
        insert_fixture_data(stat, realm_data, RealmCount)
        installation_data: Mapping[Optional[str], List[int]] = {
            None: self.generate_fixture_data(stat, 0.8, 0.2, 4, 0.3, 6, partial_sum=True),
        }
        insert_fixture_data(stat, installation_data, InstallationCount)
        FillState.objects.create(
            property=stat.property, end_time=last_end_time, state=FillState.DONE
        )
        stat = COUNT_STATS["7day_actives::day"]
        realm_data = {
            None: self.generate_fixture_data(stat, 0.2, 0.07, 3, 0.3, 6, partial_sum=True),
        }
        insert_fixture_data(stat, realm_data, RealmCount)
        installation_data = {
            None: self.generate_fixture_data(stat, 2, 0.7, 4, 0.3, 6, partial_sum=True),
        }
        insert_fixture_data(stat, installation_data, InstallationCount)
        FillState.objects.create(
            property=stat.property, end_time=last_end_time, state=FillState.DONE
        )
        stat = COUNT_STATS["realm_active_humans::day"]
        realm_data = {
            None: self.generate_fixture_data(stat, 0.8, 0.08, 3, 0.5, 3, partial_sum=True),
        }
        insert_fixture_data(stat, realm_data, RealmCount)
        installation_data = {
            None: self.generate_fixture_data(stat, 1, 0.3, 4, 0.5, 3, partial_sum=True),
        }
        insert_fixture_data(stat, installation_data, InstallationCount)
        FillState.objects.create(
            property=stat.property, end_time=last_end_time, state=FillState.DONE
        )
        stat = COUNT_STATS["active_users_audit:is_bot:day"]
        realm_data = {
            "false": self.generate_fixture_data(stat, 1, 0.2, 3.5, 0.8, 2, partial_sum=True),
            "true": self.generate_fixture_data(stat, 0.3, 0.05, 3, 0.3, 2, partial_sum=True),
        }
        insert_fixture_data(stat, realm_data, RealmCount)
        installation_data = {
            "false": self.generate_fixture_data(stat, 3, 1, 4, 0.8, 2, partial_sum=True),
            "true": self.generate_fixture_data(stat, 1, 0.4, 4, 0.8, 2, partial_sum=True),
        }
        insert_fixture_data(stat, installation_data, InstallationCount)
        FillState.objects.create(
            property=stat.property, end_time=last_end_time, state=FillState.DONE
        )
        stat = COUNT_STATS["messages_sent:is_bot:hour"]
        user_data: Mapping[Optional[str], List[int]] = {
            "false": self.generate_fixture_data(stat, 2, 1, 1.5, 0.6, 8, holiday_rate=0.1),
        }
        insert_fixture_data(stat, user_data, UserCount)
        realm_data = {
            "false": self.generate_fixture_data(stat, 35, 15, 6, 0.6, 4),
            "true": self.generate_fixture_data(stat, 15, 15, 3, 0.4, 2),
        }
        insert_fixture_data(stat, realm_data, RealmCount)
        installation_data = {
            "false": self.generate_fixture_data(stat, 350, 150, 6, 0.6, 4),
            "true": self.generate_fixture_data(stat, 150, 150, 3, 0.4, 2),
        }
        insert_fixture_data(stat, installation_data, InstallationCount)
        FillState.objects.create(
            property=stat.property, end_time=last_end_time, state=FillState.DONE
        )
        stat = COUNT_STATS["messages_sent:message_type:day"]
        user_data = {
            "public_stream": self.generate_fixture_data(stat, 1.5, 1, 3, 0.6, 8),
            "private_message": self.generate_fixture_data(stat, 0.5, 0.3, 1, 0.6, 8),
            "huddle_message": self.generate_fixture_data(stat, 0.2, 0.2, 2, 0.6, 8),
        }
        insert_fixture_data(stat, user_data, UserCount)
        realm_data = {
            "public_stream": self.generate_fixture_data(stat, 30, 8, 5, 0.6, 4),
            "private_stream": self.generate_fixture_data(stat, 7, 7, 5, 0.6, 4),
            "private_message": self.generate_fixture_data(stat, 13, 5, 5, 0.6, 4),
            "huddle_message": self.generate_fixture_data(stat, 6, 3, 3, 0.6, 4),
        }
        insert_fixture_data(stat, realm_data, RealmCount)
        installation_data = {
            "public_stream": self.generate_fixture_data(stat, 300, 80, 5, 0.6, 4),
            "private_stream": self.generate_fixture_data(stat, 70, 70, 5, 0.6, 4),
            "private_message": self.generate_fixture_data(stat, 130, 50, 5, 0.6, 4),
            "huddle_message": self.generate_fixture_data(stat, 60, 30, 3, 0.6, 4),
        }
        insert_fixture_data(stat, installation_data, InstallationCount)
        FillState.objects.create(
            property=stat.property, end_time=last_end_time, state=FillState.DONE
        )
        website, created = Client.objects.get_or_create(name="website")
        old_desktop, created = Client.objects.get_or_create(name="desktop app Linux 0.3.7")
        android, created = Client.objects.get_or_create(name="ZulipAndroid")
        iOS, created = Client.objects.get_or_create(name="ZulipiOS")
        react_native, created = Client.objects.get_or_create(name="ZulipMobile")
        API, created = Client.objects.get_or_create(name="API: Python")
        zephyr_mirror, created = Client.objects.get_or_create(name="zephyr_mirror")
        unused, created = Client.objects.get_or_create(name="unused")
        long_webhook, created = Client.objects.get_or_create(name="ZulipLooooooooooongNameWebhook")
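Each `get_or_create` call returns the existing `Client` row or makes a new one, plus a `created` flag. An in-memory sketch of that contract (a hypothetical registry, not the Django ORM):

```python
# In-memory sketch of the get_or_create pattern used for the Client rows.
registry = {}

def get_or_create(name: str):
    # Return (object, created): created is True only on first insertion.
    if name in registry:
        return registry[name], False
    registry[name] = {"name": name}
    return registry[name], True

website, created = get_or_create("website")
again, created_again = get_or_create("website")
```

This is why the command can be re-run safely: existing client rows are reused rather than duplicated.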
        stat = COUNT_STATS["messages_sent:client:day"]
        user_data = {
            website.id: self.generate_fixture_data(stat, 2, 1, 1.5, 0.6, 8),
            zephyr_mirror.id: self.generate_fixture_data(stat, 0, 0.3, 1.5, 0.6, 8),
        }
        insert_fixture_data(stat, user_data, UserCount)
        realm_data = {
            website.id: self.generate_fixture_data(stat, 30, 20, 5, 0.6, 3),
            old_desktop.id: self.generate_fixture_data(stat, 5, 3, 8, 0.6, 3),
            android.id: self.generate_fixture_data(stat, 5, 5, 2, 0.6, 3),
            iOS.id: self.generate_fixture_data(stat, 5, 5, 2, 0.6, 3),
            react_native.id: self.generate_fixture_data(stat, 5, 5, 10, 0.6, 3),
            API.id: self.generate_fixture_data(stat, 5, 5, 5, 0.6, 3),
            zephyr_mirror.id: self.generate_fixture_data(stat, 1, 1, 3, 0.6, 3),
            unused.id: self.generate_fixture_data(stat, 0, 0, 0, 0, 0),
            long_webhook.id: self.generate_fixture_data(stat, 5, 5, 2, 0.6, 3),
        }
        insert_fixture_data(stat, realm_data, RealmCount)
        installation_data = {
            website.id: self.generate_fixture_data(stat, 300, 200, 5, 0.6, 3),
            old_desktop.id: self.generate_fixture_data(stat, 50, 30, 8, 0.6, 3),
            android.id: self.generate_fixture_data(stat, 50, 50, 2, 0.6, 3),
            iOS.id: self.generate_fixture_data(stat, 50, 50, 2, 0.6, 3),
            react_native.id: self.generate_fixture_data(stat, 5, 5, 10, 0.6, 3),
            API.id: self.generate_fixture_data(stat, 50, 50, 5, 0.6, 3),
            zephyr_mirror.id: self.generate_fixture_data(stat, 10, 10, 3, 0.6, 3),
            unused.id: self.generate_fixture_data(stat, 0, 0, 0, 0, 0),
            long_webhook.id: self.generate_fixture_data(stat, 50, 50, 2, 0.6, 3),
        }
        insert_fixture_data(stat, installation_data, InstallationCount)
        FillState.objects.create(
            property=stat.property, end_time=last_end_time, state=FillState.DONE
        )
        stat = COUNT_STATS["messages_in_stream:is_bot:day"]
        realm_data = {
            "false": self.generate_fixture_data(stat, 30, 5, 6, 0.6, 4),
            "true": self.generate_fixture_data(stat, 20, 2, 3, 0.2, 3),
        }
        insert_fixture_data(stat, realm_data, RealmCount)
        stream_data: Mapping[Optional[str], List[int]] = {
            "false": self.generate_fixture_data(stat, 10, 7, 5, 0.6, 4),
            "true": self.generate_fixture_data(stat, 5, 3, 2, 0.4, 2),
        }
        insert_fixture_data(stat, stream_data, StreamCount)
        FillState.objects.create(
            property=stat.property, end_time=last_end_time, state=FillState.DONE
        )

        stat = COUNT_STATS["messages_read::hour"]
        user_data = {
            None: self.generate_fixture_data(stat, 7, 3, 2, 0.6, 8, holiday_rate=0.1),
        }
        insert_fixture_data(stat, user_data, UserCount)
        realm_data = {None: self.generate_fixture_data(stat, 50, 35, 6, 0.6, 4)}
        insert_fixture_data(stat, realm_data, RealmCount)
        FillState.objects.create(
            property=stat.property, end_time=last_end_time, state=FillState.DONE
        )


@@ -0,0 +1,152 @@
import datetime
from argparse import ArgumentParser
from typing import Any, List

from django.core.management.base import BaseCommand
from django.db.models import Count
from django.utils.timezone import now as timezone_now

from zerver.models import Message, Realm, Recipient, Stream, \
    Subscription, UserActivity, UserMessage, UserProfile, get_realm

MOBILE_CLIENT_LIST = ["Android", "ios"]
HUMAN_CLIENT_LIST = MOBILE_CLIENT_LIST + ["website"]

human_messages = Message.objects.filter(sending_client__name__in=HUMAN_CLIENT_LIST)

class Command(BaseCommand):
    help = "Generate statistics on realm activity."

    def add_arguments(self, parser: ArgumentParser) -> None:
        parser.add_argument('realms', metavar='<realm>', type=str, nargs='*',
                            help="realm to generate statistics for")

    def active_users(self, realm: Realm) -> List[UserProfile]:
        # Has been active (on the website, for now) in the last 7 days.
        activity_cutoff = timezone_now() - datetime.timedelta(days=7)
        return [activity.user_profile for activity in (
            UserActivity.objects.filter(user_profile__realm=realm,
                                        user_profile__is_active=True,
                                        last_visit__gt=activity_cutoff,
                                        query="/json/users/me/pointer",
                                        client__name="website"))]

    def messages_sent_by(self, user: UserProfile, days_ago: int) -> int:
        sent_time_cutoff = timezone_now() - datetime.timedelta(days=days_ago)
        return human_messages.filter(sender=user, pub_date__gt=sent_time_cutoff).count()

    def total_messages(self, realm: Realm, days_ago: int) -> int:
        sent_time_cutoff = timezone_now() - datetime.timedelta(days=days_ago)
        return Message.objects.filter(sender__realm=realm, pub_date__gt=sent_time_cutoff).count()

    def human_messages(self, realm: Realm, days_ago: int) -> int:
        sent_time_cutoff = timezone_now() - datetime.timedelta(days=days_ago)
        return human_messages.filter(sender__realm=realm, pub_date__gt=sent_time_cutoff).count()

    def api_messages(self, realm: Realm, days_ago: int) -> int:
        return (self.total_messages(realm, days_ago) - self.human_messages(realm, days_ago))

    def stream_messages(self, realm: Realm, days_ago: int) -> int:
        sent_time_cutoff = timezone_now() - datetime.timedelta(days=days_ago)
        return human_messages.filter(sender__realm=realm, pub_date__gt=sent_time_cutoff,
                                     recipient__type=Recipient.STREAM).count()

    def private_messages(self, realm: Realm, days_ago: int) -> int:
        sent_time_cutoff = timezone_now() - datetime.timedelta(days=days_ago)
        return human_messages.filter(sender__realm=realm, pub_date__gt=sent_time_cutoff).exclude(
            recipient__type=Recipient.STREAM).exclude(recipient__type=Recipient.HUDDLE).count()

    def group_private_messages(self, realm: Realm, days_ago: int) -> int:
        sent_time_cutoff = timezone_now() - datetime.timedelta(days=days_ago)
        return human_messages.filter(sender__realm=realm, pub_date__gt=sent_time_cutoff).exclude(
            recipient__type=Recipient.STREAM).exclude(recipient__type=Recipient.PERSONAL).count()

    def report_percentage(self, numerator: float, denominator: float, text: str) -> None:
        if not denominator:
            fraction = 0.0
        else:
            fraction = numerator / float(denominator)
        print("%.2f%% of" % (fraction * 100,), text)

    def handle(self, *args: Any, **options: Any) -> None:
        if options['realms']:
            try:
                realms = [get_realm(string_id) for string_id in options['realms']]
            except Realm.DoesNotExist as e:
                print(e)
                exit(1)
        else:
            realms = Realm.objects.all()

        for realm in realms:
            print(realm.string_id)

            user_profiles = UserProfile.objects.filter(realm=realm, is_active=True)
            active_users = self.active_users(realm)
            num_active = len(active_users)

            print("%d active users (%d total)" % (num_active, len(user_profiles)))
            streams = Stream.objects.filter(realm=realm).extra(
                tables=['zerver_subscription', 'zerver_recipient'],
                where=['zerver_subscription.recipient_id = zerver_recipient.id',
                       'zerver_recipient.type = 2',
                       'zerver_recipient.type_id = zerver_stream.id',
                       'zerver_subscription.active = true']).annotate(count=Count("name"))
            print("%d streams" % (streams.count(),))

            for days_ago in (1, 7, 30):
                print("In last %d days, users sent:" % (days_ago,))
                sender_quantities = [self.messages_sent_by(user, days_ago) for user in user_profiles]
                for quantity in sorted(sender_quantities, reverse=True):
                    print(quantity, end=' ')
                print("")

                print("%d stream messages" % (self.stream_messages(realm, days_ago),))
                print("%d one-on-one private messages" % (self.private_messages(realm, days_ago),))
                print("%d messages sent via the API" % (self.api_messages(realm, days_ago),))
                print("%d group private messages" % (self.group_private_messages(realm, days_ago),))

            num_notifications_enabled = len([x for x in active_users if x.enable_desktop_notifications])
            self.report_percentage(num_notifications_enabled, num_active,
                                   "active users have desktop notifications enabled")
            num_enter_sends = len([x for x in active_users if x.enter_sends])
            self.report_percentage(num_enter_sends, num_active,
                                   "active users have enter-sends")

            all_message_count = human_messages.filter(sender__realm=realm).count()
            multi_paragraph_message_count = human_messages.filter(
                sender__realm=realm, content__contains="\n\n").count()
            self.report_percentage(multi_paragraph_message_count, all_message_count,
                                   "all messages are multi-paragraph")

            # Starred messages
            starrers = UserMessage.objects.filter(user_profile__in=user_profiles,
                                                  flags=UserMessage.flags.starred).values(
                "user_profile").annotate(count=Count("user_profile"))
            print("%d users have starred %d messages" % (
                len(starrers), sum([elt["count"] for elt in starrers])))

            active_user_subs = Subscription.objects.filter(
                user_profile__in=user_profiles, active=True)

            # Streams not in home view
            non_home_view = active_user_subs.filter(in_home_view=False).values(
                "user_profile").annotate(count=Count("user_profile"))
            print("%d users have %d streams not in home view" % (
                len(non_home_view), sum([elt["count"] for elt in non_home_view])))

            # Code block markup
            markup_messages = human_messages.filter(
                sender__realm=realm, content__contains="~~~").values(
                "sender").annotate(count=Count("sender"))
            print("%d users have used code block markup on %s messages" % (
                len(markup_messages), sum([elt["count"] for elt in markup_messages])))

            # Notifications for stream messages
            notifications = active_user_subs.filter(desktop_notifications=True).values(
                "user_profile").annotate(count=Count("user_profile"))
            print("%d users receive desktop notifications for %d streams" % (
                len(notifications), sum([elt["count"] for elt in notifications])))

            print("")
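`report_percentage` guards the zero-denominator case so an empty realm prints `0.00%` rather than raising `ZeroDivisionError`. The same helper, extracted so it can run standalone (returning the line instead of printing it):

```python
def report_percentage(numerator: float, denominator: float, text: str) -> str:
    # Same guard as the management command: a realm with no active users
    # or no messages reports 0.00% instead of dividing by zero.
    if not denominator:
        fraction = 0.0
    else:
        fraction = numerator / float(denominator)
    return "%.2f%% of %s" % (fraction * 100, text)

line = report_percentage(3, 12, "active users have enter-sends")
empty = report_percentage(3, 0, "active users have enter-sends")
```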


@@ -1,26 +1,26 @@
from argparse import ArgumentParser
from typing import Any

from django.core.management.base import BaseCommand, CommandError
from django.db.models import Q

from zerver.models import Message, Realm, Recipient, Stream, Subscription, get_realm


class Command(BaseCommand):
    help = "Generate statistics on the streams for a realm."

    def add_arguments(self, parser: ArgumentParser) -> None:
        parser.add_argument(
            "realms", metavar="<realm>", nargs="*", help="realm to generate statistics for"
        )

    def handle(self, *args: Any, **options: str) -> None:
        if options["realms"]:
            try:
                realms = [get_realm(string_id) for string_id in options["realms"]]
            except Realm.DoesNotExist as e:
                raise CommandError(e)
        else:
            realms = Realm.objects.all()
@@ -36,26 +36,22 @@ class Command(BaseCommand):
                else:
                    public_count += 1
            print("------------")
            print(realm.string_id, end=" ")
            print("{:>10} {} public streams and".format("(", public_count), end=" ")
            print(f"{private_count} private streams )")
            print("------------")
            print("{:>25} {:>15} {:>10} {:>12}".format("stream", "subscribers", "messages", "type"))
            for stream in streams:
                if stream.invite_only:
                    stream_type = "private"
                else:
                    stream_type = "public"
                print(f"{stream.name:>25}", end=" ")
                recipient = Recipient.objects.filter(type=Recipient.STREAM, type_id=stream.id)
                print(
                    "{:10}".format(
                        len(Subscription.objects.filter(recipient=recipient, active=True))
                    ),
                    end=" ",
                )
                num_messages = len(Message.objects.filter(recipient=recipient))
                print(f"{num_messages:12}", end=" ")
                print(f"{stream_type:>15}")
            print("")
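The table output relies on Python's format-spec alignment: `>` right-aligns strings within the given width, and numbers right-align by default. A small standalone example of the row layout (sample values, not real stream data):

```python
# f-string alignment mirroring the table layout in the command:
# right-aligned stream name, fixed-width numeric columns.
name, subscribers, messages, stream_type = "general", 42, 1300, "public"
row = f"{name:>25} {subscribers:10} {messages:12} {stream_type:>15}"
```

Note the command's header widths and row widths do not match column for column; the sketch simply demonstrates the mechanism.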


@@ -1,13 +1,13 @@
import os
import time
from argparse import ArgumentParser
from datetime import timezone
from typing import Any, Dict

from django.conf import settings
from django.core.management.base import BaseCommand
from django.utils.dateparse import parse_datetime
from django.utils.timezone import now as timezone_now

from analytics.lib.counts import COUNT_STATS, logger, process_count_stat
from scripts.lib.zulip_tools import ENDC, WARNING
@@ -15,36 +15,34 @@ from zerver.lib.remote_server import send_analytics_to_remote_server
from zerver.lib.timestamp import floor_to_hour
from zerver.models import Realm


class Command(BaseCommand):
    help = """Fills Analytics tables.

    Run as a cron job that runs every hour."""

    def add_arguments(self, parser: ArgumentParser) -> None:
        parser.add_argument(
            "--time",
            "-t",
            help="Update stat tables from current state to "
            "--time. Defaults to the current time.",
            default=timezone_now().isoformat(),
        )
        parser.add_argument("--utc", action="store_true", help="Interpret --time in UTC.")
        parser.add_argument(
            "--stat", "-s", help="CountStat to process. If omitted, all stats are processed."
        )
        parser.add_argument(
            "--verbose", action="store_true", help="Print timing information to stdout."
        )
    def handle(self, *args: Any, **options: Any) -> None:
        try:
            os.mkdir(settings.ANALYTICS_LOCK_DIR)
        except OSError:
            print(
                f"{WARNING}Analytics lock {settings.ANALYTICS_LOCK_DIR} is unavailable;"
                f" exiting.{ENDC}"
            )
            return

        try:
@@ -59,37 +57,34 @@ class Command(BaseCommand):
logger.info("No realms, stopping update_analytics_counts") logger.info("No realms, stopping update_analytics_counts")
return return
fill_to_time = parse_datetime(options["time"]) fill_to_time = parse_datetime(options['time'])
if options["utc"]: if options['utc']:
fill_to_time = fill_to_time.replace(tzinfo=timezone.utc) fill_to_time = fill_to_time.replace(tzinfo=timezone_utc)
if fill_to_time.tzinfo is None: if fill_to_time.tzinfo is None:
raise ValueError( raise ValueError("--time must be timezone aware. Maybe you meant to use the --utc option?")
"--time must be timezone aware. Maybe you meant to use the --utc option?"
)
fill_to_time = floor_to_hour(fill_to_time.astimezone(timezone.utc)) fill_to_time = floor_to_hour(fill_to_time.astimezone(timezone_utc))
if options["stat"] is not None: if options['stat'] is not None:
stats = [COUNT_STATS[options["stat"]]] stats = [COUNT_STATS[options['stat']]]
else: else:
stats = list(COUNT_STATS.values()) stats = list(COUNT_STATS.values())
logger.info("Starting updating analytics counts through %s", fill_to_time) logger.info("Starting updating analytics counts through %s" % (fill_to_time,))
if options["verbose"]: if options['verbose']:
start = time.time() start = time.time()
last = start last = start
for stat in stats: for stat in stats:
process_count_stat(stat, fill_to_time) process_count_stat(stat, fill_to_time)
if options["verbose"]: if options['verbose']:
print(f"Updated {stat.property} in {time.time() - last:.3f}s") print("Updated %s in %.3fs" % (stat.property, time.time() - last))
last = time.time() last = time.time()
if options["verbose"]: if options['verbose']:
print( print("Finished updating analytics counts through %s in %.3fs" %
f"Finished updating analytics counts through {fill_to_time} in {time.time() - start:.3f}s" (fill_to_time, time.time() - start))
) logger.info("Finished updating analytics counts through %s" % (fill_to_time,))
logger.info("Finished updating analytics counts through %s", fill_to_time)
if settings.PUSH_NOTIFICATION_BOUNCER_URL and settings.SUBMIT_USAGE_STATISTICS: if settings.PUSH_NOTIFICATION_BOUNCER_URL and settings.SUBMIT_USAGE_STATISTICS:
send_analytics_to_remote_server() send_analytics_to_remote_server()
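For reference, the `--time`/`--utc` handling above can be sketched with the standard library alone; `floor_to_hour` here is a stand-in mirroring the helper the command uses, not Zulip's actual implementation:

```python
from datetime import datetime, timezone

def floor_to_hour(dt: datetime) -> datetime:
    # Truncate an aware datetime to the top of its hour.
    return dt.replace(minute=0, second=0, microsecond=0)

# parse_datetime() on a bare --time value yields a naive datetime;
# --utc attaches UTC directly rather than converting from local time.
naive = datetime(2019, 3, 15, 11, 39, 10)
fill_to_time = naive.replace(tzinfo=timezone.utc)
fill_to_time = floor_to_hour(fill_to_time.astimezone(timezone.utc))
print(fill_to_time.isoformat())  # 2019-03-15T11:00:00+00:00
```

Without `--utc`, a naive value still has `tzinfo is None` after parsing, which is exactly the case the `ValueError` above rejects.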


@@ -0,0 +1,42 @@
import datetime
from argparse import ArgumentParser
from typing import Any

from django.core.management.base import BaseCommand
from django.utils.timezone import now as timezone_now

from zerver.models import Message, Realm, Stream, UserProfile, get_realm

class Command(BaseCommand):
    help = "Generate statistics on user activity."

    def add_arguments(self, parser: ArgumentParser) -> None:
        parser.add_argument('realms', metavar='<realm>', type=str, nargs='*',
                            help="realm to generate statistics for")

    def messages_sent_by(self, user: UserProfile, week: int) -> int:
        start = timezone_now() - datetime.timedelta(days=(week + 1) * 7)
        end = timezone_now() - datetime.timedelta(days=week * 7)
        return Message.objects.filter(sender=user, pub_date__gt=start, pub_date__lte=end).count()

    def handle(self, *args: Any, **options: Any) -> None:
        if options['realms']:
            try:
                realms = [get_realm(string_id) for string_id in options['realms']]
            except Realm.DoesNotExist as e:
                print(e)
                exit(1)
        else:
            realms = Realm.objects.all()

        for realm in realms:
            print(realm.string_id)
            user_profiles = UserProfile.objects.filter(realm=realm, is_active=True)
            print("%d users" % (len(user_profiles),))
            print("%d streams" % (len(Stream.objects.filter(realm=realm)),))
            for user_profile in user_profiles:
                print("%35s" % (user_profile.email,), end=' ')
                for week in range(10):
                    print("%5d" % (self.messages_sent_by(user_profile, week)), end=' ')
                print("")
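The week windows computed by `messages_sent_by` above tile backwards from the current time in 7-day steps; a minimal standalone check of that arithmetic (plain datetimes, no ORM):

```python
import datetime

def week_window(now: datetime.datetime, week: int):
    # Week 0 covers the most recent 7 days, week 1 the 7 days before that, etc.
    start = now - datetime.timedelta(days=(week + 1) * 7)
    end = now - datetime.timedelta(days=week * 7)
    return start, end

now = datetime.datetime(2019, 3, 15)
w0_start, w0_end = week_window(now, 0)
w1_start, w1_end = week_window(now, 1)
# Adjacent windows meet exactly at their boundary; the pub_date__gt/__lte
# filters in the query above ensure no message is counted twice.
```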


@@ -1,209 +1,110 @@
# -*- coding: utf-8 -*-
import django.db.models.deletion
from django.conf import settings
from django.db import migrations, models

class Migration(migrations.Migration):

    dependencies = [
        ('zerver', '0030_realm_org_type'),
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
    ]

    operations = [
        migrations.CreateModel(
            name='Anomaly',
            fields=[
                ('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
                ('info', models.CharField(max_length=1000)),
            ],
            bases=(models.Model,),
        ),
        migrations.CreateModel(
            name='HuddleCount',
            fields=[
                ('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
                ('huddle', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='zerver.Recipient')),
                ('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
                ('property', models.CharField(max_length=40)),
                ('end_time', models.DateTimeField()),
                ('interval', models.CharField(max_length=20)),
                ('value', models.BigIntegerField()),
                ('anomaly', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='analytics.Anomaly', null=True)),
            ],
            bases=(models.Model,),
        ),
        migrations.CreateModel(
            name='InstallationCount',
            fields=[
                ('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
                ('property', models.CharField(max_length=40)),
                ('end_time', models.DateTimeField()),
                ('interval', models.CharField(max_length=20)),
                ('value', models.BigIntegerField()),
                ('anomaly', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='analytics.Anomaly', null=True)),
            ],
            bases=(models.Model,),
        ),
        migrations.CreateModel(
            name='RealmCount',
            fields=[
                ('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
                ('realm', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='zerver.Realm')),
                ('property', models.CharField(max_length=40)),
                ('end_time', models.DateTimeField()),
                ('interval', models.CharField(max_length=20)),
                ('value', models.BigIntegerField()),
                ('anomaly', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='analytics.Anomaly', null=True)),
            ],
            bases=(models.Model,),
        ),
        migrations.CreateModel(
            name='StreamCount',
            fields=[
                ('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
                ('realm', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='zerver.Realm')),
                ('stream', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='zerver.Stream')),
                ('property', models.CharField(max_length=40)),
                ('end_time', models.DateTimeField()),
                ('interval', models.CharField(max_length=20)),
                ('value', models.BigIntegerField()),
                ('anomaly', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='analytics.Anomaly', null=True)),
            ],
            bases=(models.Model,),
        ),
        migrations.CreateModel(
            name='UserCount',
            fields=[
                ('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
                ('realm', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='zerver.Realm')),
                ('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
                ('property', models.CharField(max_length=40)),
                ('end_time', models.DateTimeField()),
                ('interval', models.CharField(max_length=20)),
                ('value', models.BigIntegerField()),
                ('anomaly', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='analytics.Anomaly', null=True)),
            ],
            bases=(models.Model,),
        ),
        migrations.AlterUniqueTogether(
            name='usercount',
            unique_together=set([('user', 'property', 'end_time', 'interval')]),
        ),
        migrations.AlterUniqueTogether(
            name='streamcount',
            unique_together=set([('stream', 'property', 'end_time', 'interval')]),
        ),
        migrations.AlterUniqueTogether(
            name='realmcount',
            unique_together=set([('realm', 'property', 'end_time', 'interval')]),
        ),
        migrations.AlterUniqueTogether(
            name='installationcount',
            unique_together=set([('property', 'end_time', 'interval')]),
        ),
        migrations.AlterUniqueTogether(
            name='huddlecount',
            unique_together=set([('huddle', 'property', 'end_time', 'interval')]),
        ),
    ]


@@ -1,30 +1,30 @@
# -*- coding: utf-8 -*-
from django.db import migrations

class Migration(migrations.Migration):

    dependencies = [
        ('analytics', '0001_initial'),
    ]

    operations = [
        migrations.AlterUniqueTogether(
            name='huddlecount',
            unique_together=set([]),
        ),
        migrations.RemoveField(
            model_name='huddlecount',
            name='anomaly',
        ),
        migrations.RemoveField(
            model_name='huddlecount',
            name='huddle',
        ),
        migrations.RemoveField(
            model_name='huddlecount',
            name='user',
        ),
        migrations.DeleteModel(
            name='HuddleCount',
        ),
    ]


@@ -1,26 +1,21 @@
# -*- coding: utf-8 -*-
from django.db import migrations, models

class Migration(migrations.Migration):

    dependencies = [
        ('analytics', '0002_remove_huddlecount'),
    ]

    operations = [
        migrations.CreateModel(
            name='FillState',
            fields=[
                ('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
                ('property', models.CharField(unique=True, max_length=40)),
                ('end_time', models.DateTimeField()),
                ('state', models.PositiveSmallIntegerField()),
                ('last_modified', models.DateTimeField(auto_now=True)),
            ],
            bases=(models.Model,),
        ),


@@ -1,31 +1,31 @@
# -*- coding: utf-8 -*-
from django.db import migrations, models

class Migration(migrations.Migration):

    dependencies = [
        ('analytics', '0003_fillstate'),
    ]

    operations = [
        migrations.AddField(
            model_name='installationcount',
            name='subgroup',
            field=models.CharField(max_length=16, null=True),
        ),
        migrations.AddField(
            model_name='realmcount',
            name='subgroup',
            field=models.CharField(max_length=16, null=True),
        ),
        migrations.AddField(
            model_name='streamcount',
            name='subgroup',
            field=models.CharField(max_length=16, null=True),
        ),
        migrations.AddField(
            model_name='usercount',
            name='subgroup',
            field=models.CharField(max_length=16, null=True),
        ),
    ]


@@ -1,51 +1,51 @@
# -*- coding: utf-8 -*-
from django.db import migrations, models

class Migration(migrations.Migration):

    dependencies = [
        ('analytics', '0004_add_subgroup'),
    ]

    operations = [
        migrations.AlterField(
            model_name='installationcount',
            name='interval',
            field=models.CharField(max_length=8),
        ),
        migrations.AlterField(
            model_name='installationcount',
            name='property',
            field=models.CharField(max_length=32),
        ),
        migrations.AlterField(
            model_name='realmcount',
            name='interval',
            field=models.CharField(max_length=8),
        ),
        migrations.AlterField(
            model_name='realmcount',
            name='property',
            field=models.CharField(max_length=32),
        ),
        migrations.AlterField(
            model_name='streamcount',
            name='interval',
            field=models.CharField(max_length=8),
        ),
        migrations.AlterField(
            model_name='streamcount',
            name='property',
            field=models.CharField(max_length=32),
        ),
        migrations.AlterField(
            model_name='usercount',
            name='interval',
            field=models.CharField(max_length=8),
        ),
        migrations.AlterField(
            model_name='usercount',
            name='property',
            field=models.CharField(max_length=32),
        ),
    ]


@@ -1,27 +1,27 @@
# -*- coding: utf-8 -*-
from django.db import migrations

class Migration(migrations.Migration):

    dependencies = [
        ('analytics', '0005_alter_field_size'),
    ]

    operations = [
        migrations.AlterUniqueTogether(
            name='installationcount',
            unique_together=set([('property', 'subgroup', 'end_time', 'interval')]),
        ),
        migrations.AlterUniqueTogether(
            name='realmcount',
            unique_together=set([('realm', 'property', 'subgroup', 'end_time', 'interval')]),
        ),
        migrations.AlterUniqueTogether(
            name='streamcount',
            unique_together=set([('stream', 'property', 'subgroup', 'end_time', 'interval')]),
        ),
        migrations.AlterUniqueTogether(
            name='usercount',
            unique_together=set([('user', 'property', 'subgroup', 'end_time', 'interval')]),
        ),
    ]


@@ -1,44 +1,44 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.10.4 on 2017-01-16 20:50
from django.db import migrations

class Migration(migrations.Migration):

    dependencies = [
        ('analytics', '0006_add_subgroup_to_unique_constraints'),
    ]

    operations = [
        migrations.AlterUniqueTogether(
            name='installationcount',
            unique_together=set([('property', 'subgroup', 'end_time')]),
        ),
        migrations.RemoveField(
            model_name='installationcount',
            name='interval',
        ),
        migrations.AlterUniqueTogether(
            name='realmcount',
            unique_together=set([('realm', 'property', 'subgroup', 'end_time')]),
        ),
        migrations.RemoveField(
            model_name='realmcount',
            name='interval',
        ),
        migrations.AlterUniqueTogether(
            name='streamcount',
            unique_together=set([('stream', 'property', 'subgroup', 'end_time')]),
        ),
        migrations.RemoveField(
            model_name='streamcount',
            name='interval',
        ),
        migrations.AlterUniqueTogether(
            name='usercount',
            unique_together=set([('user', 'property', 'subgroup', 'end_time')]),
        ),
        migrations.RemoveField(
            model_name='usercount',
            name='interval',
        ),
    ]


@@ -1,25 +1,25 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.10.5 on 2017-02-01 22:28
from django.db import migrations

class Migration(migrations.Migration):

    dependencies = [
        ('zerver', '0050_userprofile_avatar_version'),
        ('analytics', '0007_remove_interval'),
    ]

    operations = [
        migrations.AlterIndexTogether(
            name='realmcount',
            index_together=set([('property', 'end_time')]),
        ),
        migrations.AlterIndexTogether(
            name='streamcount',
            index_together=set([('property', 'realm', 'end_time')]),
        ),
        migrations.AlterIndexTogether(
            name='usercount',
            index_together=set([('property', 'realm', 'end_time')]),
        ),
    ]


@@ -1,29 +1,26 @@
# -*- coding: utf-8 -*-
from django.db import migrations
from django.db.backends.postgresql_psycopg2.schema import DatabaseSchemaEditor
from django.db.migrations.state import StateApps

def delete_messages_sent_to_stream_stat(apps: StateApps, schema_editor: DatabaseSchemaEditor) -> None:
    UserCount = apps.get_model('analytics', 'UserCount')
    StreamCount = apps.get_model('analytics', 'StreamCount')
    RealmCount = apps.get_model('analytics', 'RealmCount')
    InstallationCount = apps.get_model('analytics', 'InstallationCount')
    FillState = apps.get_model('analytics', 'FillState')

    property = 'messages_sent_to_stream:is_bot'
    UserCount.objects.filter(property=property).delete()
    StreamCount.objects.filter(property=property).delete()
    RealmCount.objects.filter(property=property).delete()
    InstallationCount.objects.filter(property=property).delete()
    FillState.objects.filter(property=property).delete()

class Migration(migrations.Migration):

    dependencies = [
        ('analytics', '0008_add_count_indexes'),
    ]

    operations = [


@@ -1,28 +1,25 @@
# -*- coding: utf-8 -*-
from django.db import migrations
from django.db.backends.postgresql_psycopg2.schema import DatabaseSchemaEditor
from django.db.migrations.state import StateApps

def clear_message_sent_by_message_type_values(apps: StateApps, schema_editor: DatabaseSchemaEditor) -> None:
    UserCount = apps.get_model('analytics', 'UserCount')
    StreamCount = apps.get_model('analytics', 'StreamCount')
    RealmCount = apps.get_model('analytics', 'RealmCount')
    InstallationCount = apps.get_model('analytics', 'InstallationCount')
    FillState = apps.get_model('analytics', 'FillState')

    property = 'messages_sent:message_type:day'
    UserCount.objects.filter(property=property).delete()
    StreamCount.objects.filter(property=property).delete()
    RealmCount.objects.filter(property=property).delete()
    InstallationCount.objects.filter(property=property).delete()
    FillState.objects.filter(property=property).delete()

class Migration(migrations.Migration):

    dependencies = [('analytics', '0009_remove_messages_to_stream_stat')]

    operations = [
        migrations.RunPython(clear_message_sent_by_message_type_values),


@@ -1,14 +1,14 @@
# -*- coding: utf-8 -*-
from django.db import migrations
from django.db.backends.postgresql_psycopg2.schema import DatabaseSchemaEditor
from django.db.migrations.state import StateApps

def clear_analytics_tables(apps: StateApps, schema_editor: DatabaseSchemaEditor) -> None:
    UserCount = apps.get_model('analytics', 'UserCount')
    StreamCount = apps.get_model('analytics', 'StreamCount')
    RealmCount = apps.get_model('analytics', 'RealmCount')
    InstallationCount = apps.get_model('analytics', 'InstallationCount')
    FillState = apps.get_model('analytics', 'FillState')

    UserCount.objects.all().delete()
    StreamCount.objects.all().delete()
@@ -16,11 +16,10 @@ def clear_analytics_tables(apps: StateApps, schema_editor: DatabaseSchemaEditor)
    InstallationCount.objects.all().delete()
    FillState.objects.all().delete()

class Migration(migrations.Migration):

    dependencies = [
        ('analytics', '0010_clear_messages_sent_values'),
    ]

    operations = [


@@ -1,42 +1,36 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.11.6 on 2018-01-29 08:14
from __future__ import unicode_literals

from django.db import migrations, models
import django.db.models.deletion

class Migration(migrations.Migration):

    dependencies = [
        ('analytics', '0011_clear_analytics_tables'),
    ]

    operations = [
        migrations.AlterField(
            model_name='installationcount',
            name='anomaly',
            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, to='analytics.Anomaly'),
        ),
        migrations.AlterField(
            model_name='realmcount',
            name='anomaly',
            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, to='analytics.Anomaly'),
        ),
        migrations.AlterField(
            model_name='streamcount',
            name='anomaly',
            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, to='analytics.Anomaly'),
        ),
        migrations.AlterField(
            model_name='usercount',
            name='anomaly',
            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, to='analytics.Anomaly'),
        ),
    ]


@@ -1,4 +1,6 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.11.18 on 2019-02-02 02:47
from __future__ import unicode_literals

from django.db import migrations
@@ -6,27 +8,27 @@ from django.db import migrations

class Migration(migrations.Migration):

    dependencies = [
        ('analytics', '0012_add_on_delete'),
    ]

    operations = [
        migrations.RemoveField(
            model_name='installationcount',
            name='anomaly',
        ),
        migrations.RemoveField(
            model_name='realmcount',
            name='anomaly',
        ),
        migrations.RemoveField(
            model_name='streamcount',
            name='anomaly',
        ),
        migrations.RemoveField(
            model_name='usercount',
            name='anomaly',
        ),
        migrations.DeleteModel(
            name='Anomaly',
        ),
    ]


@@ -1,17 +0,0 @@
# Generated by Django 1.11.26 on 2020-01-27 04:32
from django.db import migrations

class Migration(migrations.Migration):

    dependencies = [
        ("analytics", "0013_remove_anomaly"),
    ]

    operations = [
        migrations.RemoveField(
            model_name="fillstate",
            name="last_modified",
        ),
    ]


@@ -1,65 +0,0 @@
from django.db import migrations
from django.db.backends.postgresql.schema import DatabaseSchemaEditor
from django.db.migrations.state import StateApps
from django.db.models import Count, Sum
def clear_duplicate_counts(apps: StateApps, schema_editor: DatabaseSchemaEditor) -> None:
"""This is a preparatory migration for our Analytics tables.
The backstory is that Django's unique_together indexes do not properly
handle the subgroup=None corner case (allowing duplicate rows that have a
subgroup of None), which meant that in race conditions, rather than updating
an existing row for the property/(realm, stream, user)/time with subgroup=None, Django would
create a duplicate row.
In the next migration, we'll add a proper constraint to fix this bug, but
we need to fix any existing problematic rows before we can add that constraint.
We fix this in an appropriate fashion for each type of CountStat object; mainly
this means deleting the extra rows, but for LoggingCountStat objects, we need to
additionally combine the sums.
"""
count_tables = dict(
realm=apps.get_model("analytics", "RealmCount"),
user=apps.get_model("analytics", "UserCount"),
stream=apps.get_model("analytics", "StreamCount"),
installation=apps.get_model("analytics", "InstallationCount"),
)
for name, count_table in count_tables.items():
value = [name, "property", "end_time"]
if name == "installation":
value = ["property", "end_time"]
counts = (
count_table.objects.filter(subgroup=None)
.values(*value)
.annotate(Count("id"), Sum("value"))
.filter(id__count__gt=1)
)
for count in counts:
count.pop("id__count")
total_value = count.pop("value__sum")
duplicate_counts = list(count_table.objects.filter(**count))
first_count = duplicate_counts[0]
if count["property"] in ["invites_sent::day", "active_users_log:is_bot:day"]:
# For LoggingCountStat objects, the right fix is to combine the totals;
# for other CountStat objects, we expect the duplicates to have the same value.
# And so all we need to do is delete them.
first_count.value = total_value
first_count.save()
to_cleanup = duplicate_counts[1:]
for duplicate_count in to_cleanup:
            duplicate_count.delete()


class Migration(migrations.Migration):
dependencies = [
("analytics", "0014_remove_fillstate_last_modified"),
]
operations = [
migrations.RunPython(clear_duplicate_counts, reverse_code=migrations.RunPython.noop),
]
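The deduplication strategy described in the migration's docstring can be sketched framework-free. This is a minimal illustration only; `dedupe_counts`, `rows`, and `logging_properties` are hypothetical names, not part of Zulip's codebase:

```python
# Pure-Python sketch of the strategy in the migration docstring: for rows that
# collide on their identifying key (with subgroup=None), LoggingCountStat-style
# properties combine their sums, while all other properties keep the first row
# and simply drop the duplicates (their values are expected to be identical).
def dedupe_counts(rows, logging_properties):
    merged = {}
    for key, value in rows:  # key ~ (property, end_time, ...)
        if key not in merged:
            merged[key] = value
        elif key[0] in logging_properties:
            merged[key] += value  # combine sums for LoggingCountStat rows
        # otherwise: duplicates carry identical values, so the extra row is dropped
    return merged
```

The real migration does the equivalent with one `.values(...).annotate(Count("id"), Sum("value"))` query per count table.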


@@ -1,93 +0,0 @@
# Generated by Django 2.2.10 on 2020-02-29 19:40
from django.db import migrations, models


class Migration(migrations.Migration):
dependencies = [
("analytics", "0015_clear_duplicate_counts"),
]
operations = [
migrations.AlterUniqueTogether(
name="installationcount",
unique_together=set(),
),
migrations.AlterUniqueTogether(
name="realmcount",
unique_together=set(),
),
migrations.AlterUniqueTogether(
name="streamcount",
unique_together=set(),
),
migrations.AlterUniqueTogether(
name="usercount",
unique_together=set(),
),
migrations.AddConstraint(
model_name="installationcount",
constraint=models.UniqueConstraint(
condition=models.Q(subgroup__isnull=False),
fields=("property", "subgroup", "end_time"),
name="unique_installation_count",
),
),
migrations.AddConstraint(
model_name="installationcount",
constraint=models.UniqueConstraint(
condition=models.Q(subgroup__isnull=True),
fields=("property", "end_time"),
name="unique_installation_count_null_subgroup",
),
),
migrations.AddConstraint(
model_name="realmcount",
constraint=models.UniqueConstraint(
condition=models.Q(subgroup__isnull=False),
fields=("realm", "property", "subgroup", "end_time"),
name="unique_realm_count",
),
),
migrations.AddConstraint(
model_name="realmcount",
constraint=models.UniqueConstraint(
condition=models.Q(subgroup__isnull=True),
fields=("realm", "property", "end_time"),
name="unique_realm_count_null_subgroup",
),
),
migrations.AddConstraint(
model_name="streamcount",
constraint=models.UniqueConstraint(
condition=models.Q(subgroup__isnull=False),
fields=("stream", "property", "subgroup", "end_time"),
name="unique_stream_count",
),
),
migrations.AddConstraint(
model_name="streamcount",
constraint=models.UniqueConstraint(
condition=models.Q(subgroup__isnull=True),
fields=("stream", "property", "end_time"),
name="unique_stream_count_null_subgroup",
),
),
migrations.AddConstraint(
model_name="usercount",
constraint=models.UniqueConstraint(
condition=models.Q(subgroup__isnull=False),
fields=("user", "property", "subgroup", "end_time"),
name="unique_user_count",
),
),
migrations.AddConstraint(
model_name="usercount",
constraint=models.UniqueConstraint(
condition=models.Q(subgroup__isnull=True),
fields=("user", "property", "end_time"),
name="unique_user_count_null_subgroup",
),
),
]
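The paired conditional `UniqueConstraint`s this migration adds compile down to partial unique indexes at the SQL level. A standalone sqlite3 sketch (table and index names here are illustrative, not Zulip's schema) shows why two indexes are needed:

```python
# Sketch of the two partial unique indexes the migration creates. A plain
# UNIQUE(property, subgroup, end_time) does not stop duplicates when subgroup
# is NULL, because SQL treats NULLs as distinct from each other; the second,
# partial index on (property, end_time) WHERE subgroup IS NULL closes that gap.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE counts (property TEXT, subgroup TEXT, end_time INTEGER, value INTEGER)")
conn.execute(
    "CREATE UNIQUE INDEX unique_count ON counts (property, subgroup, end_time) "
    "WHERE subgroup IS NOT NULL"
)
conn.execute(
    "CREATE UNIQUE INDEX unique_count_null_subgroup ON counts (property, end_time) "
    "WHERE subgroup IS NULL"
)
conn.execute("INSERT INTO counts VALUES ('invites_sent::day', NULL, 1, 5)")
try:
    # A second row with the same property/end_time and NULL subgroup is rejected.
    conn.execute("INSERT INTO counts VALUES ('invites_sent::day', NULL, 1, 3)")
    duplicate_rejected = False
except sqlite3.IntegrityError:
    duplicate_rejected = True
```

This is why the preparatory migration 0015 has to delete the existing duplicate rows first: the partial index cannot be built while duplicates exist.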


@@ -2,139 +2,91 @@ import datetime
 from typing import Optional
 
 from django.db import models
-from django.db.models import Q, UniqueConstraint
 
 from zerver.lib.timestamp import floor_to_day
 from zerver.models import Realm, Stream, UserProfile
 
 
 class FillState(models.Model):
-    property: str = models.CharField(max_length=40, unique=True)
-    end_time: datetime.datetime = models.DateTimeField()
+    property = models.CharField(max_length=40, unique=True)  # type: str
+    end_time = models.DateTimeField()  # type: datetime.datetime
 
     # Valid states are {DONE, STARTED}
     DONE = 1
     STARTED = 2
-    state: int = models.PositiveSmallIntegerField()
+    state = models.PositiveSmallIntegerField()  # type: int
+
+    last_modified = models.DateTimeField(auto_now=True)  # type: datetime.datetime
 
     def __str__(self) -> str:
-        return f"<FillState: {self.property} {self.end_time} {self.state}>"
+        return "<FillState: %s %s %s>" % (self.property, self.end_time, self.state)
 
 
 # The earliest/starting end_time in FillState
 # We assume there is at least one realm
 def installation_epoch() -> datetime.datetime:
-    earliest_realm_creation = Realm.objects.aggregate(models.Min("date_created"))[
-        "date_created__min"
-    ]
+    earliest_realm_creation = Realm.objects.aggregate(models.Min('date_created'))['date_created__min']
     return floor_to_day(earliest_realm_creation)
 
-
-def last_successful_fill(property: str) -> Optional[datetime.datetime]:
-    fillstate = FillState.objects.filter(property=property).first()
-    if fillstate is None:
-        return None
-    if fillstate.state == FillState.DONE:
-        return fillstate.end_time
-    return fillstate.end_time - datetime.timedelta(hours=1)
-
 
 class BaseCount(models.Model):
     # Note: When inheriting from BaseCount, you may want to rearrange
     # the order of the columns in the migration to make sure they
     # match how you'd like the table to be arranged.
-    property: str = models.CharField(max_length=32)
-    subgroup: Optional[str] = models.CharField(max_length=16, null=True)
-    end_time: datetime.datetime = models.DateTimeField()
-    value: int = models.BigIntegerField()
+    property = models.CharField(max_length=32)  # type: str
+    subgroup = models.CharField(max_length=16, null=True)  # type: Optional[str]
+    end_time = models.DateTimeField()  # type: datetime.datetime
+    value = models.BigIntegerField()  # type: int
 
     class Meta:
         abstract = True
 
 
 class InstallationCount(BaseCount):
 
     class Meta:
-        # Handles invalid duplicate InstallationCount data
-        constraints = [
-            UniqueConstraint(
-                fields=["property", "subgroup", "end_time"],
-                condition=Q(subgroup__isnull=False),
-                name="unique_installation_count",
-            ),
-            UniqueConstraint(
-                fields=["property", "end_time"],
-                condition=Q(subgroup__isnull=True),
-                name="unique_installation_count_null_subgroup",
-            ),
-        ]
+        unique_together = ("property", "subgroup", "end_time")
 
     def __str__(self) -> str:
-        return f"<InstallationCount: {self.property} {self.subgroup} {self.value}>"
+        return "<InstallationCount: %s %s %s>" % (self.property, self.subgroup, self.value)
 
 
 class RealmCount(BaseCount):
     realm = models.ForeignKey(Realm, on_delete=models.CASCADE)
 
     class Meta:
-        # Handles invalid duplicate RealmCount data
-        constraints = [
-            UniqueConstraint(
-                fields=["realm", "property", "subgroup", "end_time"],
-                condition=Q(subgroup__isnull=False),
-                name="unique_realm_count",
-            ),
-            UniqueConstraint(
-                fields=["realm", "property", "end_time"],
-                condition=Q(subgroup__isnull=True),
-                name="unique_realm_count_null_subgroup",
-            ),
-        ]
+        unique_together = ("realm", "property", "subgroup", "end_time")
         index_together = ["property", "end_time"]
 
     def __str__(self) -> str:
-        return f"<RealmCount: {self.realm} {self.property} {self.subgroup} {self.value}>"
+        return "<RealmCount: %s %s %s %s>" % (self.realm, self.property, self.subgroup, self.value)
 
 
 class UserCount(BaseCount):
     user = models.ForeignKey(UserProfile, on_delete=models.CASCADE)
     realm = models.ForeignKey(Realm, on_delete=models.CASCADE)
 
     class Meta:
-        # Handles invalid duplicate UserCount data
-        constraints = [
-            UniqueConstraint(
-                fields=["user", "property", "subgroup", "end_time"],
-                condition=Q(subgroup__isnull=False),
-                name="unique_user_count",
-            ),
-            UniqueConstraint(
-                fields=["user", "property", "end_time"],
-                condition=Q(subgroup__isnull=True),
-                name="unique_user_count_null_subgroup",
-            ),
-        ]
+        unique_together = ("user", "property", "subgroup", "end_time")
         # This index dramatically improves the performance of
         # aggregating from users to realms
         index_together = ["property", "realm", "end_time"]
 
     def __str__(self) -> str:
-        return f"<UserCount: {self.user} {self.property} {self.subgroup} {self.value}>"
+        return "<UserCount: %s %s %s %s>" % (self.user, self.property, self.subgroup, self.value)
 
 
 class StreamCount(BaseCount):
     stream = models.ForeignKey(Stream, on_delete=models.CASCADE)
     realm = models.ForeignKey(Realm, on_delete=models.CASCADE)
 
     class Meta:
-        # Handles invalid duplicate StreamCount data
-        constraints = [
-            UniqueConstraint(
-                fields=["stream", "property", "subgroup", "end_time"],
-                condition=Q(subgroup__isnull=False),
-                name="unique_stream_count",
-            ),
-            UniqueConstraint(
-                fields=["stream", "property", "end_time"],
-                condition=Q(subgroup__isnull=True),
-                name="unique_stream_count_null_subgroup",
-            ),
-        ]
+        unique_together = ("stream", "property", "subgroup", "end_time")
         # This index dramatically improves the performance of
         # aggregating from streams to realms
         index_together = ["property", "realm", "end_time"]
 
     def __str__(self) -> str:
-        return (
-            f"<StreamCount: {self.stream} {self.property} {self.subgroup} {self.value} {self.id}>"
-        )
+        return "<StreamCount: %s %s %s %s %s>" % (
+            self.stream, self.property, self.subgroup, self.value, self.id)

File diff suppressed because it is too large


@@ -2,39 +2,28 @@ from analytics.lib.counts import CountStat
 from analytics.lib.fixtures import generate_time_series_data
 from zerver.lib.test_classes import ZulipTestCase
 
 
 # A very light test suite; the code being tested is not run in production.
 class TestFixtures(ZulipTestCase):
     def test_deterministic_settings(self) -> None:
         # test basic business_hour / non_business_hour calculation
         # test we get an array of the right length with frequency=CountStat.DAY
         data = generate_time_series_data(
-            days=7, business_hours_base=20, non_business_hours_base=15, spikiness=0
-        )
+            days=7, business_hours_base=20, non_business_hours_base=15, spikiness=0)
         self.assertEqual(data, [400, 400, 400, 400, 400, 360, 360])
 
         data = generate_time_series_data(
-            days=1,
-            business_hours_base=2000,
-            non_business_hours_base=1500,
-            growth=2,
-            spikiness=0,
-            frequency=CountStat.HOUR,
-        )
+            days=1, business_hours_base=2000, non_business_hours_base=1500,
+            growth=2, spikiness=0, frequency=CountStat.HOUR)
         # test we get an array of the right length with frequency=CountStat.HOUR
         self.assertEqual(len(data), 24)
         # test that growth doesn't affect the first data point
         self.assertEqual(data[0], 2000)
         # test that the last data point is growth times what it otherwise would be
-        self.assertEqual(data[-1], 1500 * 2)
+        self.assertEqual(data[-1], 1500*2)
 
         # test autocorrelation == 1, since that's the easiest value to test
         data = generate_time_series_data(
-            days=1,
-            business_hours_base=2000,
-            non_business_hours_base=2000,
-            autocorrelation=1,
-            frequency=CountStat.HOUR,
-        )
+            days=1, business_hours_base=2000, non_business_hours_base=2000,
+            autocorrelation=1, frequency=CountStat.HOUR)
         self.assertEqual(data[0], data[1])
         self.assertEqual(data[0], data[-1])

File diff suppressed because it is too large


@@ -1,38 +1,31 @@
-from django.conf.urls import include
-from django.urls import path
+from django.conf.urls import include, url
 
-from analytics.views import (
-    get_activity,
-    get_chart_data,
-    get_chart_data_for_installation,
-    get_chart_data_for_realm,
-    get_chart_data_for_remote_installation,
-    get_chart_data_for_remote_realm,
-    get_realm_activity,
-    get_user_activity,
-    stats,
-    stats_for_installation,
-    stats_for_realm,
-    stats_for_remote_installation,
-    stats_for_remote_realm,
-    support,
-)
-from zerver.lib.rest import rest_path
+import analytics.views
+from zerver.lib.rest import rest_dispatch
 
 i18n_urlpatterns = [
     # Server admin (user_profile.is_staff) visible stats pages
-    path("activity", get_activity),
-    path("activity/support", support, name="support"),
-    path("realm_activity/<realm_str>/", get_realm_activity),
-    path("user_activity/<email>/", get_user_activity),
-    path("stats/realm/<realm_str>/", stats_for_realm),
-    path("stats/installation", stats_for_installation),
-    path("stats/remote/<int:remote_server_id>/installation", stats_for_remote_installation),
-    path(
-        "stats/remote/<int:remote_server_id>/realm/<int:remote_realm_id>/", stats_for_remote_realm
-    ),
+    url(r'^activity$', analytics.views.get_activity,
+        name='analytics.views.get_activity'),
+    url(r'^realm_activity/(?P<realm_str>[\S]+)/$', analytics.views.get_realm_activity,
+        name='analytics.views.get_realm_activity'),
+    url(r'^user_activity/(?P<email>[\S]+)/$', analytics.views.get_user_activity,
+        name='analytics.views.get_user_activity'),
+
+    url(r'^stats/realm/(?P<realm_str>[\S]+)/$', analytics.views.stats_for_realm,
+        name='analytics.views.stats_for_realm'),
+    url(r'^stats/installation$', analytics.views.stats_for_installation,
+        name='analytics.views.stats_for_installation'),
+    url(r'^stats/remote/(?P<remote_server_id>[\S]+)/installation$',
+        analytics.views.stats_for_remote_installation,
+        name='analytics.views.stats_for_remote_installation'),
+    url(r'^stats/remote/(?P<remote_server_id>[\S]+)/realm/(?P<remote_realm_id>[\S]+)/$',
+        analytics.views.stats_for_remote_realm,
+        name='analytics.views.stats_for_remote_realm'),
     # User-visible stats page
-    path("stats", stats, name="stats"),
+    url(r'^stats$', analytics.views.stats,
+        name='analytics.views.stats'),
 ]
 
 # These endpoints are a part of the API (V1), which uses:
@@ -45,22 +38,22 @@ i18n_urlpatterns = [
 # All of these paths are accessed by either a /json or /api prefix
 
 v1_api_and_json_patterns = [
     # get data for the graphs at /stats
-    rest_path("analytics/chart_data", GET=get_chart_data),
-    rest_path("analytics/chart_data/realm/<realm_str>", GET=get_chart_data_for_realm),
-    rest_path("analytics/chart_data/installation", GET=get_chart_data_for_installation),
-    rest_path(
-        "analytics/chart_data/remote/<int:remote_server_id>/installation",
-        GET=get_chart_data_for_remote_installation,
-    ),
-    rest_path(
-        "analytics/chart_data/remote/<int:remote_server_id>/realm/<int:remote_realm_id>",
-        GET=get_chart_data_for_remote_realm,
-    ),
+    url(r'^analytics/chart_data$', rest_dispatch,
+        {'GET': 'analytics.views.get_chart_data'}),
+    url(r'^analytics/chart_data/realm/(?P<realm_str>[\S]+)$', rest_dispatch,
+        {'GET': 'analytics.views.get_chart_data_for_realm'}),
+    url(r'^analytics/chart_data/installation$', rest_dispatch,
+        {'GET': 'analytics.views.get_chart_data_for_installation'}),
+    url(r'^analytics/chart_data/remote/(?P<remote_server_id>[\S]+)/installation$', rest_dispatch,
+        {'GET': 'analytics.views.get_chart_data_for_remote_installation'}),
+    url(r'^analytics/chart_data/remote/(?P<remote_server_id>[\S]+)/realm/(?P<remote_realm_id>[\S]+)$',
+        rest_dispatch,
+        {'GET': 'analytics.views.get_chart_data_for_remote_realm'}),
 ]
 
 i18n_urlpatterns += [
-    path("api/v1/", include(v1_api_and_json_patterns)),
-    path("json/", include(v1_api_and_json_patterns)),
+    url(r'^api/v1/', include(v1_api_and_json_patterns)),
+    url(r'^json/', include(v1_api_and_json_patterns)),
 ]
 
 urlpatterns = i18n_urlpatterns

File diff suppressed because it is too large


@@ -1,26 +0,0 @@
"use strict";
module.exports = {
plugins: [
[
"formatjs",
{
additionalFunctionNames: ["$t", "$t_html"],
overrideIdFn: (id, defaultMessage) => defaultMessage,
},
],
],
presets: [
[
"@babel/preset-env",
{
corejs: "3.6",
loose: true, // Loose mode for…of loops are 5× faster in Firefox
shippedProposals: true,
useBuiltIns: "usage",
},
],
"@babel/typescript",
],
sourceType: "unambiguous",
};


@@ -1,3 +1,5 @@
+# -*- coding: utf-8 -*-
+
 # Copyright: (c) 2008, Jarek Zgoda <jarek.zgoda@gmail.com>
 # Permission is hereby granted, free of charge, to any person obtaining a
@@ -19,4 +21,4 @@
 # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
 # IN THE SOFTWARE.
 
-VERSION = (0, 9, "pre")
+VERSION = (0, 9, 'pre')


@@ -1,39 +1,27 @@
+# -*- coding: utf-8 -*-
+from django.db import models, migrations
 import django.db.models.deletion
-from django.db import migrations, models
 
 
 class Migration(migrations.Migration):
 
     dependencies = [
-        ("contenttypes", "0001_initial"),
+        ('contenttypes', '0001_initial'),
     ]
 
     operations = [
         migrations.CreateModel(
-            name="Confirmation",
+            name='Confirmation',
             fields=[
-                (
-                    "id",
-                    models.AutoField(
-                        verbose_name="ID", serialize=False, auto_created=True, primary_key=True
-                    ),
-                ),
-                ("object_id", models.PositiveIntegerField()),
-                ("date_sent", models.DateTimeField(verbose_name="sent")),
-                (
-                    "confirmation_key",
-                    models.CharField(max_length=40, verbose_name="activation key"),
-                ),
-                (
-                    "content_type",
-                    models.ForeignKey(
-                        on_delete=django.db.models.deletion.CASCADE, to="contenttypes.ContentType"
-                    ),
-                ),
+                ('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
+                ('object_id', models.PositiveIntegerField()),
+                ('date_sent', models.DateTimeField(verbose_name='sent')),
+                ('confirmation_key', models.CharField(max_length=40, verbose_name='activation key')),
+                ('content_type', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='contenttypes.ContentType')),
             ],
             options={
-                "verbose_name": "confirmation email",
-                "verbose_name_plural": "confirmation emails",
+                'verbose_name': 'confirmation email',
+                'verbose_name_plural': 'confirmation emails',
             },
             bases=(models.Model,),
         ),


@@ -1,28 +1,21 @@
+# -*- coding: utf-8 -*-
+from django.db import models, migrations
 import django.utils.timezone
-from django.db import migrations, models
 
 
 class Migration(migrations.Migration):
 
     dependencies = [
-        ("confirmation", "0001_initial"),
+        ('confirmation', '0001_initial'),
     ]
 
     operations = [
         migrations.CreateModel(
-            name="RealmCreationKey",
+            name='RealmCreationKey',
             fields=[
-                (
-                    "id",
-                    models.AutoField(
-                        verbose_name="ID", serialize=False, auto_created=True, primary_key=True
-                    ),
-                ),
-                ("creation_key", models.CharField(max_length=40, verbose_name="activation key")),
-                (
-                    "date_created",
-                    models.DateTimeField(default=django.utils.timezone.now, verbose_name="created"),
-                ),
+                ('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
+                ('creation_key', models.CharField(max_length=40, verbose_name='activation key')),
+                ('date_created', models.DateTimeField(default=django.utils.timezone.now, verbose_name='created')),
             ],
         ),
     ]


@@ -1,3 +1,4 @@
+# -*- coding: utf-8 -*-
 # Generated by Django 1.10.4 on 2017-01-17 09:16
 
 from django.db import migrations
@@ -5,16 +6,17 @@ from django.db import migrations
 class Migration(migrations.Migration):
 
     dependencies = [
-        ("confirmation", "0002_realmcreationkey"),
+        ('confirmation', '0002_realmcreationkey'),
     ]
 
     operations = [
         migrations.CreateModel(
-            name="EmailChangeConfirmation",
-            fields=[],
+            name='EmailChangeConfirmation',
+            fields=[
+            ],
             options={
-                "proxy": True,
+                'proxy': True,
             },
-            bases=("confirmation.confirmation",),
+            bases=('confirmation.confirmation',),
         ),
     ]


@@ -1,3 +1,4 @@
+# -*- coding: utf-8 -*-
 # Generated by Django 1.11.2 on 2017-07-08 04:23
 
 from django.db import migrations, models
@@ -5,31 +6,31 @@ from django.db import migrations, models
 class Migration(migrations.Migration):
 
     dependencies = [
-        ("confirmation", "0003_emailchangeconfirmation"),
+        ('confirmation', '0003_emailchangeconfirmation'),
     ]
 
     operations = [
         migrations.DeleteModel(
-            name="EmailChangeConfirmation",
+            name='EmailChangeConfirmation',
         ),
         migrations.AlterModelOptions(
-            name="confirmation",
+            name='confirmation',
             options={},
         ),
         migrations.AddField(
-            model_name="confirmation",
-            name="type",
+            model_name='confirmation',
+            name='type',
             field=models.PositiveSmallIntegerField(default=1),
             preserve_default=False,
         ),
         migrations.AlterField(
-            model_name="confirmation",
-            name="confirmation_key",
+            model_name='confirmation',
+            name='confirmation_key',
             field=models.CharField(max_length=40),
         ),
         migrations.AlterField(
-            model_name="confirmation",
-            name="date_sent",
+            model_name='confirmation',
+            name='date_sent',
             field=models.DateTimeField(),
         ),
     ]


@@ -1,21 +1,22 @@
+# -*- coding: utf-8 -*-
 # Generated by Django 1.11.6 on 2017-11-30 00:13
-import django.db.models.deletion
+from __future__ import unicode_literals
 
 from django.db import migrations, models
+import django.db.models.deletion
 
 
 class Migration(migrations.Migration):
 
     dependencies = [
-        ("zerver", "0124_stream_enable_notifications"),
-        ("confirmation", "0004_remove_confirmationmanager"),
+        ('zerver', '0124_stream_enable_notifications'),
+        ('confirmation', '0004_remove_confirmationmanager'),
     ]
 
     operations = [
         migrations.AddField(
-            model_name="confirmation",
-            name="realm",
-            field=models.ForeignKey(
-                null=True, on_delete=django.db.models.deletion.CASCADE, to="zerver.Realm"
-            ),
+            model_name='confirmation',
+            name='realm',
+            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, to='zerver.Realm'),
         ),
     ]


@@ -1,4 +1,6 @@
+# -*- coding: utf-8 -*-
 # Generated by Django 1.11.6 on 2018-01-29 18:39
+from __future__ import unicode_literals
 
 from django.db import migrations, models
@@ -6,13 +8,13 @@ from django.db import migrations, models
 class Migration(migrations.Migration):
 
     dependencies = [
-        ("confirmation", "0005_confirmation_realm"),
+        ('confirmation', '0005_confirmation_realm'),
     ]
 
     operations = [
         migrations.AddField(
-            model_name="realmcreationkey",
-            name="presume_email_valid",
+            model_name='realmcreationkey',
+            name='presume_email_valid',
             field=models.BooleanField(default=False),
         ),
     ]


@@ -1,37 +0,0 @@
# Generated by Django 2.2.10 on 2020-03-27 09:02
from django.db import migrations, models


class Migration(migrations.Migration):
dependencies = [
("confirmation", "0006_realmcreationkey_presume_email_valid"),
]
operations = [
migrations.AlterField(
model_name="confirmation",
name="confirmation_key",
field=models.CharField(db_index=True, max_length=40),
),
migrations.AlterField(
model_name="confirmation",
name="date_sent",
field=models.DateTimeField(db_index=True),
),
migrations.AlterField(
model_name="confirmation",
name="object_id",
field=models.PositiveIntegerField(db_index=True),
),
migrations.AlterField(
model_name="realmcreationkey",
name="creation_key",
field=models.CharField(db_index=True, max_length=40, verbose_name="activation key"),
),
migrations.AlterUniqueTogether(
name="confirmation",
unique_together={("type", "confirmation_key")},
),
]


@@ -1,24 +1,26 @@
+# -*- coding: utf-8 -*-
 # Copyright: (c) 2008, Jarek Zgoda <jarek.zgoda@gmail.com>
 
-__revision__ = "$Id: models.py 28 2009-10-22 15:03:02Z jarek.zgoda $"
+__revision__ = '$Id: models.py 28 2009-10-22 15:03:02Z jarek.zgoda $'
 
 import datetime
-import secrets
-from base64 import b32encode
-from typing import Mapping, Optional, Union
-from urllib.parse import urljoin
 
-from django.conf import settings
-from django.contrib.contenttypes.fields import GenericForeignKey
-from django.contrib.contenttypes.models import ContentType
 from django.db import models
 from django.db.models import CASCADE
+from django.urls import reverse
+from django.conf import settings
+from django.contrib.contenttypes.models import ContentType
+from django.contrib.contenttypes.fields import GenericForeignKey
 from django.http import HttpRequest, HttpResponse
 from django.shortcuts import render
-from django.urls import reverse
 from django.utils.timezone import now as timezone_now
 
-from zerver.models import EmailChangeStatus, MultiuseInvite, PreregistrationUser, Realm, UserProfile
+from zerver.models import PreregistrationUser, EmailChangeStatus, MultiuseInvite, \
+    UserProfile, Realm
+
+from random import SystemRandom
+import string
+from typing import Dict, Optional, Union
 
 class ConfirmationKeyException(Exception):
@@ -29,35 +31,27 @@ class ConfirmationKeyException(Exception):
         super().__init__()
         self.error_type = error_type
 
-
-def render_confirmation_key_error(
-    request: HttpRequest, exception: ConfirmationKeyException
-) -> HttpResponse:
+def render_confirmation_key_error(request: HttpRequest, exception: ConfirmationKeyException) -> HttpResponse:
     if exception.error_type == ConfirmationKeyException.WRONG_LENGTH:
-        return render(request, "confirmation/link_malformed.html", status=404)
+        return render(request, 'confirmation/link_malformed.html')
     if exception.error_type == ConfirmationKeyException.EXPIRED:
-        return render(request, "confirmation/link_expired.html", status=404)
-    return render(request, "confirmation/link_does_not_exist.html", status=404)
-
+        return render(request, 'confirmation/link_expired.html')
+    return render(request, 'confirmation/link_does_not_exist.html')
 
 def generate_key() -> str:
+    generator = SystemRandom()
     # 24 characters * 5 bits of entropy/character = 120 bits of entropy
-    return b32encode(secrets.token_bytes(15)).decode().lower()
-
+    return ''.join(generator.choice(string.ascii_lowercase + string.digits) for _ in range(24))
 
 ConfirmationObjT = Union[MultiuseInvite, PreregistrationUser, EmailChangeStatus]
 
-
-def get_object_from_key(
-    confirmation_key: str, confirmation_type: int, activate_object: bool = True
-) -> ConfirmationObjT:
+def get_object_from_key(confirmation_key: str,
+                        confirmation_type: int) -> ConfirmationObjT:
     # Confirmation keys used to be 40 characters
     if len(confirmation_key) not in (24, 40):
         raise ConfirmationKeyException(ConfirmationKeyException.WRONG_LENGTH)
     try:
-        confirmation = Confirmation.objects.get(
-            confirmation_key=confirmation_key, type=confirmation_type
-        )
+        confirmation = Confirmation.objects.get(confirmation_key=confirmation_key,
+                                                type=confirmation_type)
     except Confirmation.DoesNotExist:
         raise ConfirmationKeyException(ConfirmationKeyException.DOES_NOT_EXIST)
@@ -66,53 +60,38 @@ def get_object_from_key(
raise ConfirmationKeyException(ConfirmationKeyException.EXPIRED) raise ConfirmationKeyException(ConfirmationKeyException.EXPIRED)
obj = confirmation.content_object obj = confirmation.content_object
if activate_object and hasattr(obj, "status"): if hasattr(obj, "status"):
obj.status = getattr(settings, "STATUS_ACTIVE", 1) obj.status = getattr(settings, 'STATUS_ACTIVE', 1)
obj.save(update_fields=["status"]) obj.save(update_fields=['status'])
return obj return obj
def create_confirmation_link(obj: ContentType, host: str,
def create_confirmation_link( confirmation_type: int,
obj: ContentType, confirmation_type: int, url_args: Mapping[str, str] = {} url_args: Optional[Dict[str, str]]=None) -> str:
) -> str:
key = generate_key() key = generate_key()
realm = None realm = None
if hasattr(obj, "realm"): if hasattr(obj, 'realm'):
realm = obj.realm realm = obj.realm
elif isinstance(obj, Realm): Confirmation.objects.create(content_object=obj, date_sent=timezone_now(), confirmation_key=key,
realm = obj realm=realm, type=confirmation_type)
return confirmation_url(key, host, confirmation_type, url_args)
Confirmation.objects.create( def confirmation_url(confirmation_key: str, host: str,
content_object=obj,
date_sent=timezone_now(),
confirmation_key=key,
realm=realm,
type=confirmation_type,
)
return confirmation_url(key, realm, confirmation_type, url_args)
def confirmation_url(
confirmation_key: str,
realm: Optional[Realm],
confirmation_type: int, confirmation_type: int,
url_args: Mapping[str, str] = {}, url_args: Optional[Dict[str, str]]=None) -> str:
) -> str: if url_args is None:
url_args = dict(url_args) url_args = {}
url_args["confirmation_key"] = confirmation_key url_args['confirmation_key'] = confirmation_key
return urljoin( return '%s%s%s' % (settings.EXTERNAL_URI_SCHEME, host,
settings.ROOT_DOMAIN_URI if realm is None else realm.uri, reverse(_properties[confirmation_type].url_name, kwargs=url_args))
reverse(_properties[confirmation_type].url_name, kwargs=url_args),
)
class Confirmation(models.Model):
    content_type = models.ForeignKey(ContentType, on_delete=CASCADE)
    object_id: int = models.PositiveIntegerField(db_index=True)
    content_object = GenericForeignKey("content_type", "object_id")
    date_sent: datetime.datetime = models.DateTimeField(db_index=True)
    confirmation_key: str = models.CharField(max_length=40, db_index=True)
    realm: Optional[Realm] = models.ForeignKey(Realm, null=True, on_delete=CASCADE)
    # The following list is the set of valid types
    USER_REGISTRATION = 1
@@ -123,52 +102,39 @@ class Confirmation(models.Model):
    MULTIUSE_INVITE = 6
    REALM_CREATION = 7
    REALM_REACTIVATION = 8
    type: int = models.PositiveSmallIntegerField()

    def __str__(self) -> str:
        return f"<Confirmation: {self.content_object}>"

    class Meta:
        unique_together = ("type", "confirmation_key")
class ConfirmationType:
    def __init__(
        self,
        url_name: str,
        validity_in_days: int = settings.CONFIRMATION_LINK_DEFAULT_VALIDITY_DAYS,
    ) -> None:
        self.url_name = url_name
        self.validity_in_days = validity_in_days

_properties = {
    Confirmation.USER_REGISTRATION: ConfirmationType("get_prereg_key_and_redirect"),
    Confirmation.INVITATION: ConfirmationType(
        "get_prereg_key_and_redirect", validity_in_days=settings.INVITATION_LINK_VALIDITY_DAYS
    ),
    Confirmation.EMAIL_CHANGE: ConfirmationType("confirm_email_change"),
    Confirmation.UNSUBSCRIBE: ConfirmationType(
        "unsubscribe",
        validity_in_days=1000000,  # should never expire
    ),
    Confirmation.MULTIUSE_INVITE: ConfirmationType(
        "join", validity_in_days=settings.INVITATION_LINK_VALIDITY_DAYS
    ),
    Confirmation.REALM_CREATION: ConfirmationType("get_prereg_key_and_redirect"),
    Confirmation.REALM_REACTIVATION: ConfirmationType("realm_reactivation"),
}
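Each `ConfirmationType` carries a `validity_in_days` that bounds how long its links work (the UNSUBSCRIBE type uses 1000000 days, i.e. effectively never expires). A minimal sketch of the expiry check this implies, using only the stdlib; the function name is hypothetical, not from this codebase:

```python
import datetime

def is_key_expired(date_sent: datetime.datetime, validity_in_days: int,
                   now: datetime.datetime) -> bool:
    # A confirmation link is usable for validity_in_days after date_sent.
    return now - date_sent > datetime.timedelta(days=validity_in_days)

sent = datetime.datetime(2019, 1, 1)
assert not is_key_expired(sent, 10, datetime.datetime(2019, 1, 5))
assert is_key_expired(sent, 10, datetime.datetime(2019, 2, 1))
```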
def one_click_unsubscribe_link(user_profile: UserProfile, email_type: str) -> str:
    """
    Generate a unique link that a logged-out user can visit to unsubscribe from
    Zulip e-mails without having to first log in.
    """
    return create_confirmation_link(
        user_profile, Confirmation.UNSUBSCRIBE, url_args={"email_type": email_type}
    )
# Functions related to links generated by the generate_realm_creation_link.py
# management command.
@@ -178,8 +144,7 @@ def one_click_unsubscribe_link(user_profile: UserProfile, email_type: str) -> st
# Arguably RealmCreationKey should just be another ConfirmationObjT and we should
# add another Confirmation.type for this; it's this way for historical reasons.

def validate_key(creation_key: Optional[str]) -> Optional["RealmCreationKey"]:
    """Get the record for this key, raising InvalidCreationKey if non-None but invalid."""
    if creation_key is None:
        return None
@@ -192,25 +157,23 @@ def validate_key(creation_key: Optional[str]) -> Optional["RealmCreationKey"]:
        raise RealmCreationKey.Invalid()
    return key_record
def generate_realm_creation_url(by_admin: bool = False) -> str:
    key = generate_key()
    RealmCreationKey.objects.create(
        creation_key=key, date_created=timezone_now(), presume_email_valid=by_admin
    )
    return urljoin(
        settings.ROOT_DOMAIN_URI,
        reverse("create_realm", kwargs={"creation_key": key}),
    )
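`generate_key()` is defined elsewhere in this module and is not shown in the diff; as a rough illustration of the shape of key it must produce (it must fit the 40-character `CharField` on `Confirmation` and `RealmCreationKey`), here is a hypothetical stand-in using the stdlib `secrets` module, not the project's actual implementation:

```python
import secrets
import string

ALPHABET = string.ascii_lowercase + string.digits

def generate_key_sketch() -> str:
    # Hypothetical stand-in: a 24-character lowercase alphanumeric token,
    # comfortably within the 40-character database column.
    return "".join(secrets.choice(ALPHABET) for _ in range(24))

key = generate_key_sketch()
```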
class RealmCreationKey(models.Model):
    creation_key = models.CharField("activation key", db_index=True, max_length=40)
    date_created = models.DateTimeField("created", default=timezone_now)

    # True just if we should presume the email address the user enters
    # is theirs, and skip sending mail to it to confirm that.
    presume_email_valid: bool = models.BooleanField(default=False)

    class Invalid(Exception):
        pass


@@ -1,6 +1,7 @@
# Copyright: (c) 2008, Jarek Zgoda <jarek.zgoda@gmail.com>

__revision__ = "$Id: settings.py 12 2008-11-23 19:38:52Z jarek.zgoda $"

STATUS_ACTIVE = 1
STATUS_REVOKED = 2

File diff suppressed because it is too large


@@ -1,7 +1,9 @@
# Generated by Django 1.11.14 on 2018-09-25 12:02

import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):
@@ -9,78 +11,43 @@ class Migration(migrations.Migration):
    initial = True

    dependencies = [
        ("zerver", "0189_userprofile_add_some_emojisets"),
    ]
    operations = [
        migrations.CreateModel(
            name="BillingProcessor",
            fields=[
                (
                    "id",
                    models.AutoField(
                        auto_created=True, primary_key=True, serialize=False, verbose_name="ID"
                    ),
                ),
                ("state", models.CharField(max_length=20)),
                ("last_modified", models.DateTimeField(auto_now=True)),
                (
                    "log_row",
                    models.ForeignKey(
                        on_delete=django.db.models.deletion.CASCADE, to="zerver.RealmAuditLog"
                    ),
                ),
                (
                    "realm",
                    models.OneToOneField(
                        null=True, on_delete=django.db.models.deletion.CASCADE, to="zerver.Realm"
                    ),
                ),
            ],
        ),
        migrations.CreateModel(
            name="Coupon",
            fields=[
                (
                    "id",
                    models.AutoField(
                        auto_created=True, primary_key=True, serialize=False, verbose_name="ID"
                    ),
                ),
                ("percent_off", models.SmallIntegerField(unique=True)),
                ("stripe_coupon_id", models.CharField(max_length=255, unique=True)),
            ],
        ),
        migrations.CreateModel(
            name="Customer",
            fields=[
                (
                    "id",
                    models.AutoField(
                        auto_created=True, primary_key=True, serialize=False, verbose_name="ID"
                    ),
                ),
                ("stripe_customer_id", models.CharField(max_length=255, unique=True)),
                ("has_billing_relationship", models.BooleanField(default=False)),
                (
                    "realm",
                    models.OneToOneField(
                        on_delete=django.db.models.deletion.CASCADE, to="zerver.Realm"
                    ),
                ),
            ],
        ),
        migrations.CreateModel(
            name="Plan",
            fields=[
                (
                    "id",
                    models.AutoField(
                        auto_created=True, primary_key=True, serialize=False, verbose_name="ID"
                    ),
                ),
                ("nickname", models.CharField(max_length=40, unique=True)),
                ("stripe_plan_id", models.CharField(max_length=255, unique=True)),
            ],
        ),
    ]


@@ -1,4 +1,6 @@
# Generated by Django 1.11.16 on 2018-12-12 20:19

from django.db import migrations, models
@@ -6,13 +8,13 @@ from django.db import migrations, models
class Migration(migrations.Migration):

    dependencies = [
        ("corporate", "0001_initial"),
    ]

    operations = [
        migrations.AddField(
            model_name="customer",
            name="default_discount",
            field=models.DecimalField(decimal_places=4, max_digits=7, null=True),
        ),
    ]


@@ -1,43 +1,35 @@
# Generated by Django 1.11.16 on 2018-12-22 21:05

import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):

    dependencies = [
        ("corporate", "0002_customer_default_discount"),
    ]

    operations = [
        migrations.CreateModel(
            name="CustomerPlan",
            fields=[
                (
                    "id",
                    models.AutoField(
                        auto_created=True, primary_key=True, serialize=False, verbose_name="ID"
                    ),
                ),
                ("licenses", models.IntegerField()),
                ("automanage_licenses", models.BooleanField(default=False)),
                ("charge_automatically", models.BooleanField(default=False)),
                ("price_per_license", models.IntegerField(null=True)),
                ("fixed_price", models.IntegerField(null=True)),
                ("discount", models.DecimalField(decimal_places=4, max_digits=6, null=True)),
                ("billing_cycle_anchor", models.DateTimeField()),
                ("billing_schedule", models.SmallIntegerField()),
                ("billed_through", models.DateTimeField()),
                ("next_billing_date", models.DateTimeField(db_index=True)),
                ("tier", models.SmallIntegerField()),
                ("status", models.SmallIntegerField(default=1)),
                (
                    "customer",
                    models.ForeignKey(
                        on_delete=django.db.models.deletion.CASCADE, to="corporate.Customer"
                    ),
                ),
            ],
        ),
    ]


@@ -1,35 +1,27 @@
# Generated by Django 1.11.18 on 2019-01-19 05:01

import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):

    dependencies = [
        ("corporate", "0003_customerplan"),
    ]

    operations = [
        migrations.CreateModel(
            name="LicenseLedger",
            fields=[
                (
                    "id",
                    models.AutoField(
                        auto_created=True, primary_key=True, serialize=False, verbose_name="ID"
                    ),
                ),
                ("is_renewal", models.BooleanField(default=False)),
                ("event_time", models.DateTimeField()),
                ("licenses", models.IntegerField()),
                ("licenses_at_next_renewal", models.IntegerField(null=True)),
                (
                    "plan",
                    models.ForeignKey(
                        on_delete=django.db.models.deletion.CASCADE, to="corporate.CustomerPlan"
                    ),
                ),
            ],
        ),
    ]


@@ -1,38 +1,35 @@
# Generated by Django 1.11.18 on 2019-01-28 13:04

import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):

    dependencies = [
        ("corporate", "0004_licenseledger"),
    ]

    operations = [
        migrations.RenameField(
            model_name="customerplan",
            old_name="next_billing_date",
            new_name="next_invoice_date",
        ),
        migrations.RemoveField(
            model_name="customerplan",
            name="billed_through",
        ),
        migrations.AddField(
            model_name="customerplan",
            name="invoiced_through",
            field=models.ForeignKey(
                null=True,
                on_delete=django.db.models.deletion.CASCADE,
                related_name="+",
                to="corporate.LicenseLedger",
            ),
        ),
        migrations.AddField(
            model_name="customerplan",
            name="invoicing_status",
            field=models.SmallIntegerField(default=1),
        ),
    ]


@@ -1,4 +1,6 @@
# Generated by Django 1.11.18 on 2019-01-29 01:46

from django.db import migrations, models
@@ -6,13 +8,13 @@ from django.db import migrations, models
class Migration(migrations.Migration):

    dependencies = [
        ("corporate", "0005_customerplan_invoicing"),
    ]

    operations = [
        migrations.AlterField(
            model_name="customer",
            name="stripe_customer_id",
            field=models.CharField(max_length=255, null=True, unique=True),
        ),
    ]


@@ -1,4 +1,6 @@
# Generated by Django 1.11.18 on 2019-01-31 22:16

from django.db import migrations
@@ -6,33 +8,33 @@ from django.db import migrations
class Migration(migrations.Migration):

    dependencies = [
        ("corporate", "0006_nullable_stripe_customer_id"),
    ]

    operations = [
        migrations.RemoveField(
            model_name="billingprocessor",
            name="log_row",
        ),
        migrations.RemoveField(
            model_name="billingprocessor",
            name="realm",
        ),
        migrations.DeleteModel(
            name="Coupon",
        ),
        migrations.DeleteModel(
            name="Plan",
        ),
        migrations.RemoveField(
            model_name="customer",
            name="has_billing_relationship",
        ),
        migrations.RemoveField(
            model_name="customerplan",
            name="licenses",
        ),
        migrations.DeleteModel(
            name="BillingProcessor",
        ),
    ]


@@ -1,18 +0,0 @@
# Generated by Django 1.11.20 on 2019-04-11 00:45
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("corporate", "0007_remove_deprecated_fields"),
]
operations = [
migrations.AlterField(
model_name="customerplan",
name="next_invoice_date",
field=models.DateTimeField(db_index=True, null=True),
),
]


@@ -1,18 +0,0 @@
# Generated by Django 2.2.13 on 2020-06-09 12:09
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("corporate", "0008_nullable_next_invoice_date"),
]
operations = [
migrations.AddField(
model_name="customer",
name="sponsorship_pending",
field=models.BooleanField(default=False),
),
]


@@ -7,107 +7,64 @@ from django.db.models import CASCADE
from zerver.models import Realm

class Customer(models.Model):
    realm: Realm = models.OneToOneField(Realm, on_delete=CASCADE)
    stripe_customer_id: str = models.CharField(max_length=255, null=True, unique=True)
    sponsorship_pending: bool = models.BooleanField(default=False)
    # A percentage, like 85.
    default_discount: Optional[Decimal] = models.DecimalField(
        decimal_places=4, max_digits=7, null=True
    )

    def __str__(self) -> str:
        return f"<Customer {self.realm} {self.stripe_customer_id}>"

def get_customer_by_realm(realm: Realm) -> Optional[Customer]:
    return Customer.objects.filter(realm=realm).first()

class CustomerPlan(models.Model):
    customer: Customer = models.ForeignKey(Customer, on_delete=CASCADE)
    automanage_licenses: bool = models.BooleanField(default=False)
    charge_automatically: bool = models.BooleanField(default=False)

    # Both of these are in cents. Exactly one of price_per_license or
    # fixed_price should be set. fixed_price is only for manual deals, and
    # can't be set via the self-serve billing system.
    price_per_license: Optional[int] = models.IntegerField(null=True)
    fixed_price: Optional[int] = models.IntegerField(null=True)

    # Discount that was applied. For display purposes only.
    discount: Optional[Decimal] = models.DecimalField(decimal_places=4, max_digits=6, null=True)

    billing_cycle_anchor: datetime.datetime = models.DateTimeField()
    ANNUAL = 1
    MONTHLY = 2
    billing_schedule: int = models.SmallIntegerField()

    next_invoice_date: Optional[datetime.datetime] = models.DateTimeField(db_index=True, null=True)
    invoiced_through: Optional["LicenseLedger"] = models.ForeignKey(
        "LicenseLedger", null=True, on_delete=CASCADE, related_name="+"
    )
    DONE = 1
    STARTED = 2
    INITIAL_INVOICE_TO_BE_SENT = 3
    invoicing_status: int = models.SmallIntegerField(default=DONE)

    STANDARD = 1
    PLUS = 2  # not available through self-serve signup
    ENTERPRISE = 10
    tier: int = models.SmallIntegerField()

    ACTIVE = 1
    DOWNGRADE_AT_END_OF_CYCLE = 2
    FREE_TRIAL = 3
    SWITCH_TO_ANNUAL_AT_END_OF_CYCLE = 4
    # "Live" plans should have a value < LIVE_STATUS_THRESHOLD.
    # There should be at most one live plan per customer.
    LIVE_STATUS_THRESHOLD = 10
    ENDED = 11
    NEVER_STARTED = 12
    status: int = models.SmallIntegerField(default=ACTIVE)

    # TODO maybe override setattr to ensure billing_cycle_anchor, etc are immutable

    @property
    def name(self) -> str:
        return {
            CustomerPlan.STANDARD: "Zulip Standard",
            CustomerPlan.PLUS: "Zulip Plus",
            CustomerPlan.ENTERPRISE: "Zulip Enterprise",
        }[self.tier]

    def get_plan_status_as_text(self) -> str:
        return {
            self.ACTIVE: "Active",
            self.DOWNGRADE_AT_END_OF_CYCLE: "Scheduled for downgrade at end of cycle",
            self.FREE_TRIAL: "Free trial",
            self.ENDED: "Ended",
            self.NEVER_STARTED: "Never started",
        }[self.status]

def get_current_plan_by_customer(customer: Customer) -> Optional[CustomerPlan]:
    return CustomerPlan.objects.filter(
        customer=customer, status__lt=CustomerPlan.LIVE_STATUS_THRESHOLD
    ).first()

def get_current_plan_by_realm(realm: Realm) -> Optional[CustomerPlan]:
    customer = get_customer_by_realm(realm)
    if customer is None:
        return None
    return get_current_plan_by_customer(customer)

class LicenseLedger(models.Model):
    plan: CustomerPlan = models.ForeignKey(CustomerPlan, on_delete=CASCADE)

    # Also True for the initial upgrade.
    is_renewal: bool = models.BooleanField(default=False)

    event_time: datetime.datetime = models.DateTimeField()
    licenses: int = models.IntegerField()

    # None means the plan does not automatically renew.
    # 0 means the plan has been explicitly downgraded.
    # This cannot be None if plan.automanage_licenses.
    licenses_at_next_renewal: Optional[int] = models.IntegerField(null=True)
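The `LIVE_STATUS_THRESHOLD` convention above (a plan is "live" when its `status` is below 10, and there should be at most one live plan per customer) can be sketched without Django, using a plain in-memory stand-in for `CustomerPlan` (the `PlanRow` class and `first_live_plan` helper here are hypothetical, for illustration only):

```python
from dataclasses import dataclass
from typing import Iterable, Optional

# Status constants as defined on CustomerPlan above.
ACTIVE = 1
DOWNGRADE_AT_END_OF_CYCLE = 2
LIVE_STATUS_THRESHOLD = 10
ENDED = 11

@dataclass
class PlanRow:
    """Hypothetical in-memory stand-in for a CustomerPlan row."""
    status: int

def first_live_plan(plans: Iterable[PlanRow]) -> Optional[PlanRow]:
    # Mirrors get_current_plan_by_customer(): the status__lt filter
    # selects plans whose status is below LIVE_STATUS_THRESHOLD.
    return next((p for p in plans if p.status < LIVE_STATUS_THRESHOLD), None)

plans = [PlanRow(ENDED), PlanRow(DOWNGRADE_AT_END_OF_CYCLE)]
```

Note that a plan scheduled for downgrade still counts as live, while an ended plan does not, which is why `ENDED` and `NEVER_STARTED` sit above the threshold.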


@@ -1,25 +1,10 @@
{
    "amount": 7200,
    "amount_captured": 7200,
    "amount_refunded": 0,
    "application": null,
    "application_fee": null,
    "application_fee_amount": null,
    "balance_transaction": "txn_NORMALIZED00000000000001",
    "billing_details": {
        "address": {
            "city": "Pacific",
            "country": "United States",
            "line1": "Under the sea,",
            "line2": null,
            "postal_code": "33333",
            "state": null
        },
        "email": null,
        "name": "Ada Starr",
        "phone": null
    },
    "calculated_statement_descriptor": "ZULIP STANDARD",
    "captured": true,
    "created": 1000000000,
    "currency": "usd",
@@ -27,7 +12,6 @@
    "description": "Upgrade to Zulip Standard, $12.0 x 6",
    "destination": null,
    "dispute": null,
    "disputed": false,
    "failure_code": null,
    "failure_message": null,
    "fraud_details": {},
@@ -42,34 +26,12 @@
        "network_status": "approved_by_network",
        "reason": null,
        "risk_level": "normal",
        "risk_score": 0,
        "seller_message": "Payment complete.",
        "type": "authorized"
    },
    "paid": true,
    "payment_intent": null,
    "payment_method": "card_NORMALIZED00000000000001",
    "payment_method_details": {
        "card": {
            "brand": "visa",
            "checks": {
                "address_line1_check": "pass",
                "address_postal_code_check": "pass",
                "cvc_check": "pass"
            },
            "country": "US",
            "exp_month": 3,
            "exp_year": 2033,
            "fingerprint": "NORMALIZED000001",
            "funding": "credit",
            "installments": null,
            "last4": "4242",
            "network": "visa",
            "three_d_secure": null,
            "wallet": null
        },
        "type": "card"
    },
    "receipt_email": "hamlet@zulip.com",
    "receipt_number": null,
    "receipt_url": "https://pay.stripe.com/receipts/acct_NORMALIZED000001/ch_NORMALIZED00000000000001/rcpt_NORMALIZED000000000000000000001",
@@ -110,7 +72,6 @@
    },
    "source_transfer": null,
    "statement_descriptor": "Zulip Standard",
    "statement_descriptor_suffix": null,
    "status": "succeeded",
    "transfer_data": null,
    "transfer_group": null


@@ -1,25 +1,10 @@
{
    "amount": 36000,
    "amount_captured": 36000,
    "amount_refunded": 0,
    "application": null,
    "application_fee": null,
    "application_fee_amount": null,
    "balance_transaction": "txn_NORMALIZED00000000000002",
    "billing_details": {
        "address": {
            "city": "Pacific",
            "country": "United States",
            "line1": "Under the sea,",
            "line2": null,
            "postal_code": "33333",
            "state": null
        },
        "email": null,
        "name": "Ada Starr",
        "phone": null
    },
    "calculated_statement_descriptor": "ZULIP STANDARD",
    "captured": true,
    "created": 1000000000,
    "currency": "usd",
@@ -27,7 +12,6 @@
    "description": "Upgrade to Zulip Standard, $60.0 x 6",
    "destination": null,
    "dispute": null,
    "disputed": false,
    "failure_code": null,
    "failure_message": null,
    "fraud_details": {},
@@ -42,34 +26,12 @@
        "network_status": "approved_by_network",
        "reason": null,
        "risk_level": "normal",
        "risk_score": 0,
        "seller_message": "Payment complete.",
        "type": "authorized"
    },
    "paid": true,
    "payment_intent": null,
    "payment_method": "card_NORMALIZED00000000000002",
    "payment_method_details": {
        "card": {
            "brand": "visa",
            "checks": {
                "address_line1_check": "pass",
                "address_postal_code_check": "pass",
                "cvc_check": "pass"
            },
            "country": "US",
            "exp_month": 3,
            "exp_year": 2033,
            "fingerprint": "NORMALIZED000001",
            "funding": "credit",
            "installments": null,
            "last4": "4242",
            "network": "visa",
            "three_d_secure": null,
            "wallet": null
        },
        "type": "card"
    },
    "receipt_email": "hamlet@zulip.com",
    "receipt_number": null,
    "receipt_url": "https://pay.stripe.com/receipts/acct_NORMALIZED000001/ch_NORMALIZED00000000000002/rcpt_NORMALIZED000000000000000000002",
@@ -110,7 +72,6 @@
    },
    "source_transfer": null,
    "statement_descriptor": "Zulip Standard",
    "statement_descriptor_suffix": null,
    "status": "succeeded",
    "transfer_data": null,
    "transfer_group": null


@@ -2,26 +2,11 @@
    "data": [
        {
            "amount": 7200,
            "amount_captured": 7200,
            "amount_refunded": 0,
            "application": null,
            "application_fee": null,
            "application_fee_amount": null,
            "balance_transaction": "txn_NORMALIZED00000000000001",
            "billing_details": {
                "address": {
                    "city": "Pacific",
                    "country": "United States",
                    "line1": "Under the sea,",
                    "line2": null,
                    "postal_code": "33333",
                    "state": null
                },
                "email": null,
                "name": "Ada Starr",
                "phone": null
            },
            "calculated_statement_descriptor": "ZULIP STANDARD",
            "captured": true,
            "created": 1000000000,
            "currency": "usd",
@@ -29,7 +14,6 @@
"description": "Upgrade to Zulip Standard, $12.0 x 6", "description": "Upgrade to Zulip Standard, $12.0 x 6",
"destination": null, "destination": null,
"dispute": null, "dispute": null,
"disputed": false,
"failure_code": null, "failure_code": null,
"failure_message": null, "failure_message": null,
"fraud_details": {}, "fraud_details": {},
@@ -44,34 +28,12 @@
"network_status": "approved_by_network", "network_status": "approved_by_network",
"reason": null, "reason": null,
"risk_level": "normal", "risk_level": "normal",
"risk_score": 0, "risk_score": 00,
"seller_message": "Payment complete.", "seller_message": "Payment complete.",
"type": "authorized" "type": "authorized"
}, },
"paid": true, "paid": true,
"payment_intent": null, "payment_intent": null,
"payment_method": "card_NORMALIZED00000000000001",
"payment_method_details": {
"card": {
"brand": "visa",
"checks": {
"address_line1_check": "pass",
"address_postal_code_check": "pass",
"cvc_check": "pass"
},
"country": "US",
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"installments": null,
"last4": "4242",
"network": "visa",
"three_d_secure": null,
"wallet": null
},
"type": "card"
},
"receipt_email": "hamlet@zulip.com", "receipt_email": "hamlet@zulip.com",
"receipt_number": null, "receipt_number": null,
"receipt_url": "https://pay.stripe.com/receipts/acct_NORMALIZED000001/ch_NORMALIZED00000000000001/rcpt_NORMALIZED000000000000000000001", "receipt_url": "https://pay.stripe.com/receipts/acct_NORMALIZED000001/ch_NORMALIZED00000000000001/rcpt_NORMALIZED000000000000000000001",
@@ -106,7 +68,6 @@
}, },
"source_transfer": null, "source_transfer": null,
"statement_descriptor": "Zulip Standard", "statement_descriptor": "Zulip Standard",
"statement_descriptor_suffix": null,
"status": "succeeded", "status": "succeeded",
"transfer_data": null, "transfer_data": null,
"transfer_group": null "transfer_group": null

@@ -2,26 +2,11 @@
   "data": [
     {
       "amount": 36000,
-      "amount_captured": 36000,
       "amount_refunded": 0,
       "application": null,
       "application_fee": null,
       "application_fee_amount": null,
       "balance_transaction": "txn_NORMALIZED00000000000002",
-      "billing_details": {
-        "address": {
-          "city": "Pacific",
-          "country": "United States",
-          "line1": "Under the sea,",
-          "line2": null,
-          "postal_code": "33333",
-          "state": null
-        },
-        "email": null,
-        "name": "Ada Starr",
-        "phone": null
-      },
-      "calculated_statement_descriptor": "ZULIP STANDARD",
       "captured": true,
       "created": 1000000000,
       "currency": "usd",
@@ -29,7 +14,6 @@
       "description": "Upgrade to Zulip Standard, $60.0 x 6",
       "destination": null,
       "dispute": null,
-      "disputed": false,
       "failure_code": null,
       "failure_message": null,
       "fraud_details": {},
@@ -44,34 +28,12 @@
         "network_status": "approved_by_network",
         "reason": null,
         "risk_level": "normal",
         "risk_score": 0,
         "seller_message": "Payment complete.",
         "type": "authorized"
       },
       "paid": true,
       "payment_intent": null,
-      "payment_method": "card_NORMALIZED00000000000002",
-      "payment_method_details": {
-        "card": {
-          "brand": "visa",
-          "checks": {
-            "address_line1_check": "pass",
-            "address_postal_code_check": "pass",
-            "cvc_check": "pass"
-          },
-          "country": "US",
-          "exp_month": 3,
-          "exp_year": 2033,
-          "fingerprint": "NORMALIZED000001",
-          "funding": "credit",
-          "installments": null,
-          "last4": "4242",
-          "network": "visa",
-          "three_d_secure": null,
-          "wallet": null
-        },
-        "type": "card"
-      },
       "receipt_email": "hamlet@zulip.com",
       "receipt_number": null,
       "receipt_url": "https://pay.stripe.com/receipts/acct_NORMALIZED000001/ch_NORMALIZED00000000000002/rcpt_NORMALIZED000000000000000000002",
@@ -106,33 +68,17 @@
       },
       "source_transfer": null,
       "statement_descriptor": "Zulip Standard",
-      "statement_descriptor_suffix": null,
       "status": "succeeded",
       "transfer_data": null,
       "transfer_group": null
     },
     {
       "amount": 7200,
-      "amount_captured": 7200,
       "amount_refunded": 0,
       "application": null,
       "application_fee": null,
       "application_fee_amount": null,
       "balance_transaction": "txn_NORMALIZED00000000000001",
-      "billing_details": {
-        "address": {
-          "city": "Pacific",
-          "country": "United States",
-          "line1": "Under the sea,",
-          "line2": null,
-          "postal_code": "33333",
-          "state": null
-        },
-        "email": null,
-        "name": "Ada Starr",
-        "phone": null
-      },
-      "calculated_statement_descriptor": "ZULIP STANDARD",
       "captured": true,
       "created": 1000000000,
       "currency": "usd",
@@ -140,7 +86,6 @@
       "description": "Upgrade to Zulip Standard, $12.0 x 6",
       "destination": null,
       "dispute": null,
-      "disputed": false,
       "failure_code": null,
       "failure_message": null,
       "fraud_details": {},
@@ -155,34 +100,12 @@
         "network_status": "approved_by_network",
         "reason": null,
         "risk_level": "normal",
         "risk_score": 0,
         "seller_message": "Payment complete.",
         "type": "authorized"
       },
       "paid": true,
       "payment_intent": null,
-      "payment_method": "card_NORMALIZED00000000000001",
-      "payment_method_details": {
-        "card": {
-          "brand": "visa",
-          "checks": {
-            "address_line1_check": "pass",
-            "address_postal_code_check": "pass",
-            "cvc_check": "pass"
-          },
-          "country": "US",
-          "exp_month": 3,
-          "exp_year": 2033,
-          "fingerprint": "NORMALIZED000001",
-          "funding": "credit",
-          "installments": null,
-          "last4": "4242",
-          "network": "visa",
-          "three_d_secure": null,
-          "wallet": null
-        },
-        "type": "card"
-      },
       "receipt_email": "hamlet@zulip.com",
       "receipt_number": null,
       "receipt_url": "https://pay.stripe.com/receipts/acct_NORMALIZED000001/ch_NORMALIZED00000000000001/rcpt_NORMALIZED000000000000000000001",
@@ -217,7 +140,6 @@
       },
       "source_transfer": null,
       "statement_descriptor": "Zulip Standard",
-      "statement_descriptor_suffix": null,
       "status": "succeeded",
       "transfer_data": null,
       "transfer_group": null

@@ -1,7 +1,5 @@
 {
   "account_balance": 0,
-  "address": null,
-  "balance": 0,
   "created": 1000000000,
   "currency": null,
   "default_source": "card_NORMALIZED00000000000001",
@@ -13,7 +11,6 @@
   "invoice_prefix": "NORMA01",
   "invoice_settings": {
     "custom_fields": null,
-    "default_payment_method": null,
     "footer": null
   },
   "livemode": false,
@@ -21,11 +18,7 @@
     "realm_id": "1",
     "realm_str": "zulip"
   },
-  "name": null,
-  "next_invoice_sequence": 1,
   "object": "customer",
-  "phone": null,
-  "preferred_locales": [],
   "shipping": null,
   "sources": {
     "data": [
@@ -67,14 +60,6 @@
     "total_count": 0,
     "url": "/v1/customers/cus_NORMALIZED0001/subscriptions"
   },
-  "tax_exempt": "none",
-  "tax_ids": {
-    "data": [],
-    "has_more": false,
-    "object": "list",
-    "total_count": 0,
-    "url": "/v1/customers/cus_NORMALIZED0001/tax_ids"
-  },
   "tax_info": null,
   "tax_info_verification": null
 }

@@ -1,7 +1,5 @@
 {
   "account_balance": 0,
-  "address": null,
-  "balance": 0,
   "created": 1000000000,
   "currency": "usd",
   "default_source": {
@@ -37,7 +35,6 @@
   "invoice_prefix": "NORMA01",
   "invoice_settings": {
     "custom_fields": null,
-    "default_payment_method": null,
     "footer": null
   },
   "livemode": false,
@@ -45,11 +42,7 @@
     "realm_id": "1",
     "realm_str": "zulip"
   },
-  "name": null,
-  "next_invoice_sequence": 2,
   "object": "customer",
-  "phone": null,
-  "preferred_locales": [],
   "shipping": null,
   "sources": {
     "data": [
@@ -91,14 +84,6 @@
     "total_count": 0,
     "url": "/v1/customers/cus_NORMALIZED0001/subscriptions"
   },
-  "tax_exempt": "none",
-  "tax_ids": {
-    "data": [],
-    "has_more": false,
-    "object": "list",
-    "total_count": 0,
-    "url": "/v1/customers/cus_NORMALIZED0001/tax_ids"
-  },
   "tax_info": null,
   "tax_info_verification": null
 }

@@ -1,7 +1,5 @@
 {
   "account_balance": 0,
-  "address": null,
-  "balance": 0,
   "created": 1000000000,
   "currency": "usd",
   "default_source": "card_NORMALIZED00000000000002",
@@ -13,7 +11,6 @@
   "invoice_prefix": "NORMA01",
   "invoice_settings": {
     "custom_fields": null,
-    "default_payment_method": null,
     "footer": null
   },
   "livemode": false,
@@ -21,11 +18,7 @@
     "realm_id": "1",
     "realm_str": "zulip"
   },
-  "name": null,
-  "next_invoice_sequence": 2,
   "object": "customer",
-  "phone": null,
-  "preferred_locales": [],
   "shipping": null,
   "sources": {
     "data": [
@@ -67,14 +60,6 @@
     "total_count": 0,
     "url": "/v1/customers/cus_NORMALIZED0001/subscriptions"
   },
-  "tax_exempt": "none",
-  "tax_ids": {
-    "data": [],
-    "has_more": false,
-    "object": "list",
-    "total_count": 0,
-    "url": "/v1/customers/cus_NORMALIZED0001/tax_ids"
-  },
   "tax_info": null,
   "tax_info_verification": null
 }

@@ -1,7 +1,4 @@
 {
-  "account_country": "US",
-  "account_name": "Vishnu Test",
-  "account_tax_ids": null,
   "amount_due": 0,
   "amount_paid": 0,
   "amount_remaining": 0,
@@ -12,25 +9,13 @@
   "billing": "charge_automatically",
   "billing_reason": "manual",
   "charge": null,
-  "collection_method": "charge_automatically",
-  "created": 1000000000,
   "currency": "usd",
   "custom_fields": null,
   "customer": "cus_NORMALIZED0001",
-  "customer_address": null,
-  "customer_email": "hamlet@zulip.com",
-  "customer_name": null,
-  "customer_phone": null,
-  "customer_shipping": null,
-  "customer_tax_exempt": "none",
-  "customer_tax_ids": [],
   "date": 1000000000,
-  "default_payment_method": null,
   "default_source": null,
-  "default_tax_rates": [],
-  "description": null,
+  "description": "",
   "discount": null,
-  "discounts": [],
   "due_date": null,
   "ending_balance": null,
   "finalized_at": null,
@@ -38,16 +23,13 @@
   "hosted_invoice_url": null,
   "id": "in_NORMALIZED00000000000001",
   "invoice_pdf": null,
-  "last_finalization_error": null,
   "lines": {
     "data": [
       {
         "amount": 7200,
         "currency": "usd",
         "description": "Zulip Standard",
-        "discount_amounts": [],
         "discountable": false,
-        "discounts": [],
         "id": "ii_NORMALIZED00000000000001",
         "invoice_item": "ii_NORMALIZED00000000000001",
         "livemode": false,
@@ -58,40 +40,16 @@
           "start": 1000000000
         },
         "plan": null,
-        "price": {
-          "active": false,
-          "billing_scheme": "per_unit",
-          "created": 1000000000,
-          "currency": "usd",
-          "id": "price_1HufhsD2X8vgpBNGtyNs4AI9",
-          "livemode": false,
-          "lookup_key": null,
-          "metadata": {},
-          "nickname": null,
-          "object": "price",
-          "product": "prod_IVh67i06KRHwdX",
-          "recurring": null,
-          "tiers_mode": null,
-          "transform_quantity": null,
-          "type": "one_time",
-          "unit_amount": 1200,
-          "unit_amount_decimal": "1200"
-        },
         "proration": false,
         "quantity": 6,
         "subscription": null,
-        "tax_amounts": [],
-        "tax_rates": [],
-        "type": "invoiceitem",
-        "unique_id": "il_1HufhsD2X8vgpBNGtA08rM3i"
+        "type": "invoiceitem"
       },
       {
         "amount": -7200,
         "currency": "usd",
         "description": "Payment (Card ending in 4242)",
-        "discount_amounts": [],
         "discountable": false,
-        "discounts": [],
         "id": "ii_NORMALIZED00000000000002",
         "invoice_item": "ii_NORMALIZED00000000000002",
         "livemode": false,
@@ -102,32 +60,10 @@
           "start": 1000000000
         },
         "plan": null,
-        "price": {
-          "active": false,
-          "billing_scheme": "per_unit",
-          "created": 1000000000,
-          "currency": "usd",
-          "id": "price_1HufhrD2X8vgpBNGD9sFn8tJ",
-          "livemode": false,
-          "lookup_key": null,
-          "metadata": {},
-          "nickname": null,
-          "object": "price",
-          "product": "prod_IVh6pGP4ldOFFV",
-          "recurring": null,
-          "tiers_mode": null,
-          "transform_quantity": null,
-          "type": "one_time",
-          "unit_amount": -7200,
-          "unit_amount_decimal": "-7200"
-        },
         "proration": false,
         "quantity": 1,
         "subscription": null,
-        "tax_amounts": [],
-        "tax_rates": [],
-        "type": "invoiceitem",
-        "unique_id": "il_1HufhrD2X8vgpBNGf4QcWhh8"
+        "type": "invoiceitem"
       }
     ],
     "has_more": false,
@@ -141,28 +77,16 @@
   "number": "NORMALI-0001",
   "object": "invoice",
   "paid": false,
-  "payment_intent": null,
   "period_end": 1000000000,
   "period_start": 1000000000,
-  "post_payment_credit_notes_amount": 0,
-  "pre_payment_credit_notes_amount": 0,
   "receipt_number": null,
   "starting_balance": 0,
   "statement_descriptor": "Zulip Standard",
   "status": "draft",
-  "status_transitions": {
-    "finalized_at": null,
-    "marked_uncollectible_at": null,
-    "paid_at": null,
-    "voided_at": null
-  },
   "subscription": null,
   "subtotal": 0,
-  "tax": null,
+  "tax": 0,
   "tax_percent": null,
   "total": 0,
-  "total_discount_amounts": [],
-  "total_tax_amounts": [],
-  "transfer_data": null,
   "webhooks_delivered_at": null
 }

Some files were not shown because too many files have changed in this diff.