Compare commits

163 Commits
3.1 ... 2.1.7

Author SHA1 Message Date
Tim Abbott
f86e22a443 Release Zulip Server 2.1.7. 2020-06-25 17:11:53 -07:00
Anders Kaseorg
bd55825ab8 CVE-2020-15070: Replace eval with ast.literal_eval.
This eval function performs the inverse of the implicit
stringification that’s implied by this type-incorrect assignment in
do_update_user_custom_profile_data_if_changed:

field_value.value = field['value']

We believe there’s sufficient validation for the data being passed to
this eval that it could only have been exploited by a PostgreSQL
administrator editing the database manually.
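
To illustrate the difference, a minimal sketch (not Zulip's actual call
site; the stored value here is hypothetical):

    import ast

    stored = "['GitHub', 'email']"   # a stringified list, per the above
    ast.literal_eval(stored)         # -> ['GitHub', 'email']; literals only
    # eval(stored) would give the same result here, but would also happily
    # execute arbitrary expressions such as "__import__('os')", whereas
    # ast.literal_eval raises ValueError for anything that is not a pure
    # Python literal.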

Signed-off-by: Anders Kaseorg <anders@zulip.com>
2020-06-25 17:02:32 -07:00
Anders Kaseorg
0a827064ba memcached: Change the default MEMCACHED_USERNAME to zulip@localhost.
This prevents memcached from automatically appending the hostname to
the username, which was a source of problems on servers where the
hostname was changed.
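
For context, a sketch of the new default (the setting name is from this
commit; the exact file placement is illustrative):

    # A fully qualified username stops the SASL layer from appending the
    # local hostname, so later hostname changes cannot break memcached auth.
    MEMCACHED_USERNAME = "zulip@localhost"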

Signed-off-by: Anders Kaseorg <anders@zulip.com>
2020-06-19 20:21:54 -07:00
Tim Abbott
01902fa648 Release Zulip Server 2.1.6. 2020-06-17 00:27:37 -07:00
Tim Abbott
cbb9ea6b49 auth: Fix Python style not supported on Python 3.5.
This bug broke the 2.1.5 release on Ubuntu Xenial.
2020-06-17 00:24:42 -07:00
Tim Abbott
d163143f12 Release Zulip Server 2.1.5. 2020-06-16 23:16:06 -07:00
Tim Abbott
c21c8dcd95 CVE-2020-14215: Add migration to clear INVITED_AS_REALM_ADMIN.
This migration fixes any PreregistrationUser objects that might have
been already corrupted to have the administrator role by the buggy
original version of migration 0198_preregistrationuser_invited_as.

Since invitations that create new users as administrators are rare, it
is cleaner to just remove the role from all PreregistrationUser
objects than to filter for just those older invitation objects that
could have been corrupted by the original migration.
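
A minimal sketch of such a data migration (the role constant's value and
the dependency are assumed from context; the real migration may differ):

    from django.db import migrations

    INVITED_AS_MEMBER = 1  # assumed value of the non-admin role constant

    def clear_admin_invites(apps, schema_editor):
        # Reset every invitation's role to plain member; administrator
        # invitations are rare enough that a blanket reset is cleaner
        # than filtering for just the corrupted rows.
        PreregistrationUser = apps.get_model("zerver", "PreregistrationUser")
        PreregistrationUser.objects.update(invited_as=INVITED_AS_MEMBER)

    class Migration(migrations.Migration):
        dependencies = [("zerver", "0198_preregistrationuser_invited_as")]
        operations = [migrations.RunPython(clear_admin_invites)]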
2020-06-16 23:16:06 -07:00
Tim Abbott
82d2960ad1 CVE-2020-14215: Fix migration 0198_preregistrationuser_invited_as.
This migration incorrectly swapped the role associated with invitation
objects between members and organization administrators, resulting in
most invitation objects that existed before the upgrade to Zulip
2.0.0-rc1 or later incorrectly becoming administrator invitations.

Fixing the migration is safe and will help those installations
upgrading directly from 1.9.x to 2.1.5 or later.

A migration to fix the corrupted records will appear in an upcoming
commit.
2020-06-16 23:16:06 -07:00
Mateusz Mandera
fa07539016 CVE-2020-14215: Fix validation in PreregistrationUser queries.
The most important change here is the one in the
maybe_send_to_registration codepath, as the insufficient validation
there could lead to fetching an expired PreregistrationUser that had
been invited as an administrator, potentially years earlier, with the
registration ending up making the new user a realm administrator.

Combined with the buggy migration in
0198_preregistrationuser_invited_as.py, this led to users incorrectly
joining as organization administrators by accident.  But even without
that bug, this issue could have allowed a user who was invited as an
administrator, but then had that invitation expire and then joined via
social authentication, to incorrectly join as an organization
administrator.

The second change is in ConfirmationEmailWorker, where this wasn't a
security problem, but if the server was stopped for long enough with
some invitation emails still in the queue, then after starting it up
again, the queue worker would send out emails for invites that had
already expired.
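
A hedged sketch of the kind of check this implies (the status and
timestamp fields here are illustrative, not Zulip's exact schema):

    from datetime import timedelta

    from django.utils.timezone import now

    from zerver.models import PreregistrationUser

    INVITATION_LINK_VALIDITY_DAYS = 10  # illustrative expiry window

    def valid_invites_for(email):
        # Never fetch an invitation whose confirmation was already used
        # or whose validity window has lapsed; an expired administrator
        # invite must not determine a new user's role.
        return PreregistrationUser.objects.filter(
            email__iexact=email,
            status=0,  # illustrative "unused" status value
            invited_at__gte=now() - timedelta(days=INVITATION_LINK_VALIDITY_DAYS),
        )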

Backported to the 2.1.x series by tabbott.
2020-06-16 23:16:06 -07:00
Tim Abbott
6d0c39fd7e CVE-2020-14194: Use noopener/noreferrer for external links.
We fixed the main issue of this form in CVE-2020-9444, but the audit
done at that time only included links found in rendered_markdown; this
change completes our audit for links with target=_blank anywhere in
the codebase.
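
A hedged sketch of the kind of audit this implies (BeautifulSoup-based,
not the actual tooling used):

    from bs4 import BeautifulSoup

    def links_missing_noopener(html):
        # Find <a target="_blank"> elements whose rel attribute lacks
        # "noopener"; such links let the opened page retarget the opener
        # (reverse tabnabbing).
        soup = BeautifulSoup(html, "html.parser")
        return [
            a for a in soup.find_all("a", target="_blank")
            if "noopener" not in (a.get("rel") or [])
        ]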
2020-06-16 23:16:05 -07:00
Tim Abbott
2e2004b6c3 templates: Fix missing quoting of attributes in HTML templates.
This fixes a bundle of issues where we were missing "" around
attributes coming from variables.  In most cases, the variables were
integers or fixed constants from the Zulip codebase (E.g. the name of
an installed integration), but in at least one case it was
user-provided data that could potentially have security impact.
2020-06-16 23:12:41 -07:00
Anders Kaseorg
620e98860e auth: Accept next as POST parameter in POST requests.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
2020-06-16 23:12:41 -07:00
Anders Kaseorg
83380b4296 CVE-2020-12759: Fix reflected XSS vulnerability in Dropbox webhook.
Also check the challenge argument’s presence before using it.

Signed-off-by: Anders Kaseorg <anders@zulip.com>
2020-06-16 23:12:40 -07:00
arpit551
e88aac5105 provision: Rename --production-test-suite option in provision.
Since we also use this option in our docker-zulip project, rather than
presenting it as a test suite option we made it more specific,
i.e. --build-release-tarball-only.
2020-06-07 11:19:25 -07:00
Tim Abbott
6046ea8014 settings: Fix fetching API key with password auth disabled.
To the extent that the previous logic worked, it relied on an unlikely
race where the click handler had been set up before.
2020-06-05 11:37:26 -07:00
Tim Abbott
ba8ee93fae help: Suggest restarting server during Slack import.
This reduces the risk of folks running into OOM kills when going
through the data import process on servers with a minimal 2GB of RAM.
2020-05-12 22:17:06 -07:00
Rohitt Vashishtha
e682ea189a slack-import: Update docs to reflect the removal of Slack legacy tokens.
This commit details how users can generate the new type of API tokens
by creating a new Slack app with the correct scopes specified.

Fixes #14963.
2020-05-12 22:17:06 -07:00
Tim Abbott
148ea9fe48 slack import: Fix DefaultStream import of deactivated #random.
If the #random channel in Slack is deactivated, we should follow
Zulip's data model of not allowing deactivated, default streams.

This had apparently happened in zulipchat.com for a few organizations,
resulting in weird exceptions trying to invite new users.
2020-05-12 22:17:06 -07:00
Rohitt Vashishtha
31a34836d3 slack-import: Downgrade Slack legacy-token check failure to warning.
Slack has disabled creation of legacy tokens, which means we have to use other
tokens for importing the data. Thus, we shouldn't throw an error if the token
doesn't match the legacy token format.

Since we do not have any other validation for those tokens yet, we log a warning
but still try to continue with the import assuming that the token has the right
scopes.
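
A minimal sketch of the downgraded check (the log message wording is
illustrative):

    import logging

    def check_token(token):
        # Legacy Slack tokens start with "xoxp-"; newer app tokens don't,
        # so a mismatch is now only a warning rather than a fatal error.
        if not token.startswith("xoxp-"):
            logging.warning(
                "Slack token does not look like a legacy token; continuing "
                "and assuming it has the required scopes.")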

See https://api.slack.com/changelog/2020-02-legacy-test-token-creation-to-retire.
2020-05-12 22:17:06 -07:00
Anders Kaseorg
309266376e version: Update for Zulip Desktop v5.2.0.
Signed-off-by: Anders Kaseorg <anders@zulip.com>
2020-05-07 11:27:08 -07:00
pemontto
ef194171f7 puppet: Allow /etc/zulip to be a symlink.
This PR updates the puppet manifest to allow /etc/zulip to be a
symlink. The current behaviour overwrites /etc/zulip if it is a link to
another directory, which is problematic with docker-zulip and in
particular the `LINK_SETTINGS_TO_DATA` setting.
2020-04-17 12:45:25 -07:00
Tim Abbott
66fa35f5ac test_i18n: Update test for new translation string data. 2020-04-16 16:42:03 -07:00
Tim Abbott
2b95f54593 Release Zulip Server 2.1.4. 2020-04-16 15:37:51 -07:00
Tim Abbott
d41f06e8a9 docs: Deprecate support for Xenial and Stretch.
Also make sure our documentation for upgrading is reasonable for
Stretch => Buster.

Our reasoning for deprecating support for these releases is as follows:

* Ubuntu 16.04 Xenial reached desktop EOL last year and will reach
  EOL on the server in about a year.

* Debian Stretch will reach EOL in 2020 (the precise date is unclear in
  Debian's documentation, but based on past precedent it's in the next
  few months, perhaps July 2020).
  https://wiki.debian.org/DebianReleases#Production_Releases

* Both Ubuntu 16.04 and Debian Stretch use Python 3.5 as the system
  Python, which will reach EOL in September 2020 (and we're already
  seeing various third-party dependencies that we use drop support for
  them).

* While there is LTS support for these older releases, it's not clear it's
  going to be worth the added engineering effort for us to maintain EOL
  releases of the base OSes that we support.

* We (now) have clear upgrade instructions for moving to Debian Buster
  and Ubuntu 18.04.
2020-04-16 15:37:20 -07:00
Tim Abbott
d119e97755 i18n: Update translation data from Transifex. 2020-04-16 14:11:45 -07:00
Tim Abbott
5ea0d1d1e8 import: Make sure the internal realm is created before import.
This is critical for importing the very first realm into an empty
server, since in 27b15a9722, we changed
the model to create the internal realm when the first real realm would
be created, but neglected the data import code path.
2020-04-15 16:43:47 -07:00
Tim Abbott
fd66cfd93c upgrade-zulip: Remove tsearch-extras on upgrade.
We stopped using tsearch-extras in Zulip 2.1.0 after Anders figured
out how to achieve its goals with native postgres.  However, we never
did a `DROP EXTENSION` on systems that had upgraded, which meant that
backups created on systems originally installed with Zulip 2.0.x and
older, and later upgraded to Zulip 2.1.x, could not be restored on
Zulip servers created with a fresh install of Zulip 2.1.x.

We can't do this with a normal database migration, because DROP
EXTENSION has to be done as the postgres user, so we add some custom
migration code in the upgrade-zulip-stage-2 tool.

It's safe to run this whenever tsearch_extras.control is installed because:
* Zulip is AFAIK the only software that ever used tsearch_extras.
* The package was only installed via puppet on production servers configured to
  run a local Zulip database.
* We'll only run this code once per system, because it removes the
  package and thus the control files.
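
A hedged sketch of the idea (paths and commands are illustrative; the
real upgrade-zulip-stage-2 code differs):

    import os
    import subprocess

    # Only act when the extension's control file is installed; per the
    # notes above, that reliably identifies systems that used tsearch_extras.
    CONTROL = "/usr/share/postgresql/10/extension/tsearch_extras.control"

    if os.path.exists(CONTROL):
        # DROP EXTENSION must run as the postgres user, which is why this
        # cannot be an ordinary Django migration.
        subprocess.check_call(
            ["su", "postgres", "-c",
             'psql zulip -c "DROP EXTENSION IF EXISTS tsearch_extras"'])
        subprocess.check_call(["apt-get", "-y", "remove", "tsearch-extras"])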

Fixes #13612.
2020-04-15 15:18:53 -07:00
Anders Kaseorg
e76bab19a7 puppet: Fix puppet-lint warning.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
2020-04-08 13:07:07 -07:00
Vishnu KS
13532917ca team: Generate team page data using cron job.
This eliminates the contributors data as a possible source of
flakiness when installing Zulip from Git.

Fixes #14351.
2020-04-08 13:07:02 -07:00
Vishnu KS
b5c9a006f0 tools: Move duplicate_commits.json file to tools directory.
This is a prep commit for generating /team page data
using a cron job. The zerver/tests directory is not present in a
production installation, so we move the file from the tests
directory to tools.
2020-04-08 12:56:55 -07:00
Vishnu KS
a2edd58b82 tools: Rename update-authors-json to fetch-contributor-data. 2020-04-08 12:54:57 -07:00
Tim Abbott
d22cb7d01f Release Zulip Server 2.1.3. 2020-04-01 13:35:31 -07:00
Anders Kaseorg
76ce370181 frontend: Defensively filter unsafe links that may come from bugdown.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
2020-04-01 13:35:31 -07:00
Anders Kaseorg
64856d858e CVE-2020-10935: Fix XSS vulnerability in local link rewriting.
Make sure rewrite_local_links_to_relative does not accidentally change
the meaning of links.

Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
2020-04-01 13:35:31 -07:00
Anders Kaseorg
c9796ba7f7 CVE-2020-9444: Prevent reverse tabnabbing attacks.
While we could fix this issue by changing the markdown processor,
doing so is not a robust solution, because even a momentary bug in the
markdown processor could allow cached messages that do not follow our
security policy.

This change ensures that even if our markdown processor has bugs that
result in rendered content that does not properly follow our policy of
using rel="noopener noreferrer" on links, we'll still do something
reasonable.

Co-authored-by: Tim Abbott <tabbott@zulipchat.com>
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
2020-04-01 13:35:31 -07:00
Tim Abbott
b21117954d CVE-2020-9445: Remove unused and insecure modal_link feature.
Zulip's modal_link markdown feature has not been used since 2017; it
was a hack used for a 2013-era tutorial feature and was never used
outside that use case.

Unfortunately, its sloppy implementation was exposed in the markdown
processor for all users, not just the tutorial use case.

More importantly, it was buggy, in that it did not validate the link
using the standard validation approach used by our other code
interacting with links.

The right solution is simply to remove it.
2020-04-01 13:35:31 -07:00
Mateusz Mandera
59f5ca713f auth: Fix error on startup in django-two-factor-auth in Django 2.1+.
https://github.com/Bouke/django-two-factor-auth/issues/297
This setting was added in version 1.9 of the app; it can be used
harmlessly in our current Django 1.11-based code and will prevent an
error on Django 2.1+ when we move there.
2020-04-01 13:35:31 -07:00
Tim Abbott
67da8e8431 version: Move minimum desktop version configuration to version.py.
This makes it relatively easy for a system administrator to
temporarily override these values after a desktop app security
release that they want to ensure all of their users take.

We're not putting this in settings, since we don't want to encourage
accidental long-term overrides of these important-to-security values.
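
For context, a sketch of what such configuration looks like in
version.py (the version values here are illustrative):

    # Desktop app versions older than the minimum are blocked outright;
    # versions older than the warning threshold get a nag banner.
    DESKTOP_MINIMUM_VERSION = "5.0.0"   # illustrative value
    DESKTOP_WARNING_VERSION = "5.2.0"   # illustrative value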
2020-04-01 13:23:53 -07:00
Mateusz Mandera
b79fbf9239 requirements: Bump python-social-auth to 3.3.2. 2020-03-26 23:35:56 +00:00
Tom Daff
f1f937e4ea monitoring: Fix check-rabbitmq-consumers.
Missing commas in the definition of all the queues to check meant that it would be looking for queues with concatenated names, rather than the correct ones. Added the commas.
2020-03-25 17:19:55 -07:00
Chris Heald
68628149db integrations: Add AlertManager webhook. 2020-03-25 11:39:05 -07:00
Anders Kaseorg
f247721a2d tests: Fix test_banned_desktop_app_versions for 2.1.x.
ZulipTestCase.login_user was not added until commit
1b16693526 (#14176).

Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
2020-03-25 00:39:55 -07:00
Tim Abbott
e3d6b4f210 compatibility: Add more strict desktop app blocking.
This allows us to block use of the desktop app with insecure versions
(we simply fail to load the Zulip webapp at all, instead rendering an
error page).

For now we block only versions that are known to be both insecure and
not auto-updating, but we can easily adjust these parameters in the
future.
2020-03-24 20:35:21 -07:00
YashRE42
ea8e6149da templates: Extract navbar_alerts to separate file.
This is a prep commit for the new navbar style, separating
navbar_alerts.html from navbar.html in order to make the structure and
styles of navbar.html easier to tweak.
These templates have very little to do with each other to begin with,
apart from the fact that they are both rendered at the top of the app.
2020-03-24 20:35:17 -07:00
Rohitt Vashishtha
376cd88a83 tests: Treat github.com/zulip links as external.
Tests for these links often result in rate-limiting from GitHub,
leading to the builds failing in Circle CI. We temporarily mark
github.com/zulip links as external to keep the builds passing.
2020-03-19 17:26:53 +01:00
Mateusz Mandera
bfd92260fd requirements: Bump python-social-auth version. 2020-03-19 16:58:57 +01:00
Mateusz Mandera
217431d0c4 auth: Monkey patch a fix for Github deprecation notice spam.
This is a way to monkey-patch a fix for
https://github.com/python-social-auth/social-core/issues/430
Changes from this commit should be reverted once the issue is fixed
upstream.
2020-03-03 15:49:18 -08:00
Mateusz Mandera
30cc6798b3 auth: Fix Github auth with organization/team membership restriction.
We need to request access to read:org scope to be able to check org/team
membership. Without it SOCIAL_AUTH_GITHUB_ORG_NAME and
SOCIAL_AUTH_GITHUB_TEAM_ID settings don't work and simply lead to all
auth attempts failing.
Tested manually.
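
For reference, a sketch of the settings these scopes support (the
values here are hypothetical):

    # Restrict GitHub-based login to members of an organization or team;
    # checking membership requires the read:org OAuth scope.
    SOCIAL_AUTH_GITHUB_ORG_NAME = "example-org"   # hypothetical
    SOCIAL_AUTH_GITHUB_TEAM_ID = "123456"         # hypothetical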
2020-03-01 15:30:10 +01:00
Tim Abbott
677ad69555 docs: Update draft changelog for 2.1.3. 2020-02-28 17:06:22 -08:00
Mateusz Mandera
95118d860d home: Don't assume user agent header is set for insecure_desktop_app.
The header may not be set - this leads to CI failures on 2.1.x branch,
but in any case is a real bug.
2020-02-28 17:01:26 -08:00
Tim Abbott
b8888c801b panels: Show a banner for users with legacy desktop apps.
Users who are using ZulipDesktop or haven't managed to auto-update to
ZulipElectron should be strongly encouraged to upgrade.

We'll likely want to move to something even stricter that blocks
loading the app at all, but this is a good start.
2020-02-28 05:29:25 -08:00
Vishnu KS
7a9251a3e1 actions: Make do_change_plan_type support changing plan to SELF_HOSTED.
Credits to @xpac1985 for reporting, debugging and proposing fix to the
issue. The proposed fix was modified slightly by @hackerkid to set the
correct value for max_invites and upload_quota_gb. Tests added by
@hackerkid.

Fixes #13974
2020-02-25 16:16:48 -08:00
Pragati Agrawal
64ec413940 settings user groups: Fix organization admin can not create user groups.
The bug was in a complex `if` condition, which should mean that users
are allowed to create a user group only when they are either an admin
or the user group creation policy is set to everyone.

Fixes: #13909.
2020-02-24 12:16:36 -08:00
Mateusz Mandera
147c3998de tests: Adjust failing test on 2.1.x branch.
The KeyError is getting formatted a bit differently on the 2.1.x branch.
2020-02-24 12:11:59 -08:00
Mateusz Mandera
79fc9c3281 saml: Add SOCIAL_AUTH_SAML_SECURITY_CONFIG to default_settings.
The SOCIAL_AUTH_SAML_SECURITY_CONFIG["authnRequestsSigned"] override in
settings.py in a previous commit wouldn't work on servers old enough
not to have the SAML settings in their settings.py, due to
SOCIAL_AUTH_SAML_SECURITY_CONFIG being undefined.
This commit fixes that.
2020-02-21 09:30:36 -08:00
Mateusz Mandera
a33d7f0400 saml: Make the bad idp param KeyError log message more verbose.
The original idea was that a KeyError would only happen there when a
user passed bad input params to the endpoint, so logging a generic
message seemed sufficient. But this can also happen in case of
misconfiguration, so it's worth logging more info, as it may help in
debugging the configuration.
2020-02-20 14:49:41 -08:00
Mateusz Mandera
2471f6ad83 saml: Use rsa-sha256 as the default signature algorithm.
python3-saml uses the insecure rsa-sha1 as default.
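
A hedged sketch of the resulting default (the algorithm URI is
python3-saml's standard rsa-sha256 identifier; the exact settings code
may differ):

    SOCIAL_AUTH_SAML_SECURITY_CONFIG = {
        # python3-saml would otherwise default to rsa-sha1, which is
        # cryptographically weak.
        "signatureAlgorithm": "http://www.w3.org/2001/04/xmldsig-more#rsa-sha256",
    }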
2020-02-20 14:47:51 -08:00
Vishnu KS
19d1ca3a1d management: Make backup command work when DB is not in localhost.
This is useful preparatory work for supporting the backup management
command inside docker-zulip.
2020-02-19 14:22:47 -08:00
Anders Kaseorg
9fcbc3a49b puppet: Fix regeneration of memcached-sasldb2 on password changes.
Puppet doesn’t re-run an exec block that’s declared as creating an
existing file, even if it’s notified.  Remove the creates declaration.

Fixes #13730.

Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
2020-02-19 14:21:39 -08:00
Anders Kaseorg
1413fda773 restore-backup: Run generate_secrets.py.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
2020-02-19 14:21:38 -08:00
Tim Abbott
494e596be8 Draft release notes for 2.1.3. 2020-02-19 12:28:27 -08:00
Tim Abbott
4cc25f8e84 i18n: Add missing translation tags to typing notifications.
Thanks to Andrea Soc for the report.
2020-02-19 12:28:27 -08:00
Tim Abbott
19ab295172 email_notifications: Fix missing translation tags on sender. 2020-02-19 12:28:26 -08:00
Tim Abbott
31f02cd926 test_fixtures: Fix buggy reuse of status_dir between databases.
Apparently, the arguments passed to template_database_status were
incorrect for the manual testing development database, in that we
didn't pass a status_dir when calling into that code from provision.

The result was that provisioning before running `test-backend` would
ignore changes to the list of check_files (etc.) made after rebasing,
and vice versa.

The cleanest fix is to compute status_dir from other values passed in;
I'm also going to open a follow-up issue for creating a better overall
interface here.
2020-02-19 12:28:26 -08:00
Tim Abbott
266c7c83e0 test_fixtures: Note populate_db depends on server_initialization.py.
This should ensure that folks rebasing past this commit from an older
database model get their database rebuilt in the way that will
match the test_subs.py query count of 40.
2020-02-19 12:08:55 -08:00
Hemanth V. Alluri
dd198fd06e webhooks/ansibletower: Update for AWX 9.1.1.
Add a simple compatibility function for AWX 9.x.x. Before AWX 9.x.x
a "friendly_name" key was sent by default. Afterwards it was removed
from being a default key but we can still more or less determine if
the triggering event was a job from the REST-style URL.

Note: It is also technically possible to add the key back by defining
a custom notification template in AWX/Tower.

Resolves #13295.
2020-02-19 12:08:55 -08:00
xpac1985
10e8928b0f docs: Add info about zulip-announce RSS feed to install docs.
The mailing list can also be subscribed to via RSS/Atom feeds; I just wanted to make that information more easily accessible.
2020-02-19 12:08:55 -08:00
Ray Kraesig
bc81275d3c register: Ensure future client_capabilities fields are optional.
The `notification_settings_null` field of the `client_capabilities`
parameter is, apparently unintentionally, required.

This is mostly harmless. However, if any _future_ fields are made
required, all existing clients using this parameter will break, and it
will be needlessly difficult for new clients to specify new
capabilities in a backwards-compatible way.

Attempt to stave that possibility off with warnings.

(No functional changes.)
2020-02-19 12:08:55 -08:00
Tim Abbott
6c8c3cd3dc settings: Fix copy-from-clipboard behavior for bot tokens.
We do this by cleaning up the API for generate_zuliprc_content,
allowing us to deduplicate the previously incorrect code.
2020-02-19 12:08:55 -08:00
Vishnu KS
1783515794 emails: Use the word email instead of message in do not reply sentence.
Fixes #13693
2020-02-19 12:08:55 -08:00
Vishnu KS
21026d984b emails: Remove unnecessary call to message_content_allowed_in_missedmessage_emails. 2020-02-19 12:08:55 -08:00
Vishnu KS
66fe724c8a emails: Show proper message when email content is not shown. 2020-02-19 12:08:55 -08:00
Vishnu KS
282d6edf2e tests: Check whether body includes multiple strings in _test_cases. 2020-02-19 12:08:55 -08:00
Mateusz Mandera
785a7ec9e7 email_mirror: Handle encoded attachment filenames. 2020-02-19 12:08:55 -08:00
Mateusz Mandera
c44d9f9b1b email_mirror: Extract handle_header_content function. 2020-02-19 12:08:55 -08:00
Tim Abbott
0d5d3c4912 email_mirror: Rewrite docstrings to focus on current reality.
These docstrings hadn't been properly updated in years, and had an
awkward mix of a bad version of the user-facing documentation and
details that are no longer true (e.g. references to "Voyager").

(One important detail is that we have real documentation for this
system now).
2020-02-19 12:08:55 -08:00
Mateusz Mandera
ef793590c1 email_mirror: Parse encoded From headers with show_sender=True. 2020-02-19 12:08:55 -08:00
Tim Abbott
3032ba15cf soft_deactivation: Fix incorrect logging function.
Using logging.info() rather than logger.info() meant that our
zulip.soft_deactivation logger configuration (which, in particular,
included not logging to the console) was not active on this log line,
resulting in the `manage.py soft_deactivate_users` cron job sending
emails every time it ran.

Fixes #13750.
2020-02-19 12:08:55 -08:00
Tim Abbott
96a2ddffe7 docs: Add link from LDAP docs to invitation docs.
This addresses confusion we had with some organizations where they
were surprised that with only LDAP enabled, the "invite more users"
feature was available.

Fixes #11685.
2020-02-19 12:08:55 -08:00
Tim Abbott
2794362214 slack import: Fix handling of messages sent by user U00. 2020-02-19 12:08:55 -08:00
Vishnu KS
9b3e1e2c97 emails: Set alt attribute to empty for leading images.
The alt text of the leading images was displayed as preview
content in the inbox by email clients like Gmail. Since the leading
images were used mostly for decoration, this made the preview
content gibberish. It's fine to set the alt attributes to empty
from an accessibility point of view, since the old alt attributes
didn't add any meaningful information.
2020-02-19 12:08:55 -08:00
orientor
ae44fdd7cc settings: Fix buggy emoji format loading spinner.
When a user clicked the current emoji format in "display settings",
we'd show an infinite loading spinner (basically as a side effect of
trying to tell the server to change the emoji format to what it
already was).

Fix this by aborting early if the emoji format is already the option
that the user clicked.

Fixes #13684.
2020-02-19 12:08:55 -08:00
Tim Abbott
b45cce61e7 message_list_view: Fix handling of links to deleted streams.
Previously, links to deleted streams would be incorrectly rendered
(as the stream's name).

Fixes an issue that was reported where, after deleting the "general"
stream, the welcome turtle messages might appear as links to the
deleted stream.
2020-02-19 12:08:55 -08:00
Tim Abbott
2e923a0eb5 slack import: Improve error messages around invalid tokens.
This updates our error handling of invalid Slack API tokens (and other
networking error handling) to mostly make sense:
* A token that doesn't start with `xoxp-` gives an extended error early.
* An AssertionError for the codebase is correctly declared as such.
* We check for token shape errors before querying the Slack API.

We could still do useful work to raise custom exception classes here.

Thanks to @stavrospat for raising this issue.
2020-02-19 12:08:55 -08:00
Mateusz Mandera
f538f34d95 email_mirror: Use .walk() to search all MIME parts for attachments.
Fixes #13416

We used to search only one level in depth through the MIME structure,
and thus would miss attachments that were nested deeper (which can
happen with some email clients). We can take advantage of message.walk()
to iterate through each MIME part.
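
A minimal sketch of the approach using the standard library's email
API (the handler logic here is illustrative):

    def collect_attachments(message):
        # walk() iterates depth-first over every MIME part, so nested
        # attachments are found; looking only at the top-level payload
        # would miss anything below the first level.
        attachments = []
        for part in message.walk():
            filename = part.get_filename()
            if filename is not None:
                attachments.append((filename, part.get_payload(decode=True)))
        return attachments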
2020-02-19 12:08:55 -08:00
Mateusz Mandera
5d2befdc54 send_to_email_mirror: Fix loop setting recipient-like headers.
The return in that loop was a bug, which would lead to the To: header
not being set even though data['recipient'] = str(message['To']) is
run next and requires the header. We can remove the return
statement, and now the loop will overwrite all the potentially
troublesome headers.
2020-02-19 12:08:55 -08:00
Mateusz Mandera
cc8b83b261 email_mirror: Insert a new line before attachment links. 2020-02-19 12:08:55 -08:00
Mateusz Mandera
ac8f4aaa93 email_mirror: Check address usability in get_missed_message_address. 2020-02-19 12:08:55 -08:00
Mateusz Mandera
843c148c59 email_mirror: Give extract_and_validate a more descriptive name. 2020-02-19 12:08:55 -08:00
Mateusz Mandera
d39bcf2264 email_mirror: Reuse exception messages in mirror_email_message. 2020-02-19 12:08:55 -08:00
Tim Abbott
ce64a6b163 default stream groups: Fix broken registration UI.
The default stream groups feature (#6693) was never fully implemented;
this fixes a key detail (the registration UI being broken).
2020-02-19 12:08:55 -08:00
Tim Abbott
7875196783 default stream groups: Fix buggy LDAP behavior.
With LDAP authentication, we don't currently have a good way to
support the default stream groups feature.

The old behavior was just to assume a user selected every default
stream group, which seems wrong; since we didn't prompt the user about
these, we should just ignore the feature.
2020-02-19 12:08:55 -08:00
Mateusz Mandera
56c1ad1a3d install: Don't create internal realm in the installation process. 2020-02-19 17:05:28 +01:00
Tim Abbott
d9aa4161f8 install: Remove references to "Zulip Voyager".
"Zulip Voyager" was a name invented during the Hack Week to open
source Zulip for what a single-system Zulip server might be called, as
a Star Trek pun on the code it was based on, "Zulip Enterprise".

At the time, we just needed a name quickly, but it was never a good
name, just a placeholder.  This removes that placeholder name from
much of the codebase.  A bit more work will be required to transition
the `zulip::voyager` Puppet class, as that has some migration work
involved.
2020-02-19 17:00:17 +01:00
Mateusz Mandera
728155afee server_initialization: Add server_initialized function. 2020-02-19 16:59:56 +01:00
Mateusz Mandera
660501c782 test_classes: Fix bug where UserProfile could be passed to client_post.
It would cause a JSON overflow error while producing the URL coverage report.
2020-02-19 16:59:14 +01:00
Mateusz Mandera
ad974c3ae3 initialize_voyager_db: Deduplicate create_internal_realm logic.
zerver.lib.server_initialization.create_internal_realm has precisely the same
code (you can copy-and-paste swap them, with one level of indentation
adjustment, without generating any diff) so they can be trivially
deduplicated.
2020-02-19 16:57:44 +01:00
Mateusz Mandera
bc4029deae initialize_voyager_db: Deduplicate create_users.
zerver.lib.server_initialization.create_users has precisely the same
code (you can copy-and-paste swap them without generating any diff) so
they can be trivially deduplicated.
2020-02-19 16:54:39 +01:00
Mateusz Mandera
218ca61dd0 server_initialization: Rename some variables.
This makes the code of create_internal_realm identical to the
corresponding block in initialize_voyager_db.py.
2020-02-19 16:43:18 +01:00
Mateusz Mandera
3419908f39 initialize_voyager_db: Add comment above default client creation block. 2020-02-19 16:42:57 +01:00
Mateusz Mandera
af67990f14 server_initialization: Set internal bots owners to themselves. 2020-02-19 16:42:39 +01:00
Mateusz Mandera
e6cf30fc22 server_initialization: Remove unnecessary type annotation. 2020-02-19 16:42:15 +01:00
Mateusz Mandera
e2ccbe7c80 initialize_voyager_db: Add bot_owner argument to create_users.
This doesn't change any behavior, the purpose of this is to make the
function identical to what we have in server_initialization.py so that
it can be deduplicated in follow-up commits.
2020-02-19 16:41:56 +01:00
Mateusz Mandera
8b31387670 server_initialization: Use tos_version argument in create_users. 2020-02-19 16:41:31 +01:00
Mateusz Mandera
501eb09716 populate_db: Extract default client creation to server_initialization. 2020-02-19 16:25:30 +01:00
Mateusz Mandera
280d9db26d populate_db: Extract some functions to server_initialization.py. 2020-02-19 16:23:51 +01:00
Vishnu KS
cee6227f53 bots: Remove feedback cross realm bot.
This completes the remaining pieces of removing this bot that were
missed in d70e799466 (mostly in tests).
Backported to the 2.1.x branch.
2020-02-19 16:21:02 +01:00
Mateusz Mandera
cae803e8a9 bots: Remove FEEDBACK_BOT implementation.
This legacy cross-realm bot hasn't been used in several years, as far
as I know.  If we wanted to re-introduce it, I'd want to implement it
as an embedded bot using those common APIs, rather than the totally
custom hacky code used for it that involves unnecessary queue workers
and similar details.
Backported to the 2.1.x branch.
2020-02-19 15:26:08 +01:00
Tim Abbott
ba598366e9 Release Zulip Server 2.1.2. 2020-01-16 12:26:14 -08:00
Steve Howell
d452ad31e0 server: Sort user_ids in recent PM conversations.
This change should prevent test flakes, plus
it's more deterministic behavior for clients,
who will generally comma-join the ids into
a key for their internal data structures.

I was able to verify test coverage on this
by making the sort reversed, which would
cause test_huddle_send_message_events to
fail.
2020-01-16 12:25:11 -08:00
Steve Howell
aed813f44c bug fix: Fix huddles in "Private Messages".
If two user_ids in a recent huddle have ids
that sort lexically differently than numerically,
such as 7 and 66, then we were creating two
different buckets in pm_conversations.

This regression was introduced in
263ac0eb45 on
November 21, 2019.
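
The underlying pitfall is easy to reproduce in plain Python:

    sorted(["7", "66"])   # -> ['66', '7']  (lexical order on strings)
    sorted([7, 66])       # -> [7, 66]      (numerical order on ints)
    # Comma-joining the two orderings yields two different bucket keys,
    # "66,7" vs "7,66", for the same huddle.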
2020-01-16 12:25:11 -08:00
Steve Howell
71dae1b92a refactor: Have pm_conversations take user_ids.
Instead of having our callers pass in a possibly
non-canonical version of a user_ids_string, just
have them pass in a list.

The next commit will canonicalize the sort.
2020-01-16 12:25:11 -08:00
Steve Howell
629ec1aa8b tests: Use tricky server data in unit tests.
The server may send us ids in the order
[11, 2], instead of [2, 11].  We don't want
to rely on server behavior, regardless, for
the sort.

Our tests now show we process that data.

The current code is still buggy and causes
us to show the same huddle two different times
for situations where the lexical sort doesn't
match the numerical sort.

This happens on czo often, where Tim is user
7, and his id sorts lexically after ids like
58, 622, 4444, etc.
2020-01-16 12:25:11 -08:00
Anders Kaseorg
87d60a1fff thumbnail: Tighten fix for CVE-2019-19775 open redirect.
Due to a known but unfixed bug in the Python standard library’s
urllib.parse module (CVE-2015-2104), a crafted URL could bypass the
validation in the previous patch and still achieve an open redirect.

https://bugs.python.org/issue23505

Switch to using django.utils.http.is_safe_url, which already contains
a workaround for this bug.
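
A hedged sketch of the safer pattern (Django 1.11-era API; the helper
name is illustrative):

    from django.utils.http import is_safe_url

    def safe_redirect_target(url, request):
        # is_safe_url contains a workaround for the urllib.parse bug, so
        # crafted URLs cannot slip through to an open redirect.
        if url is not None and is_safe_url(url=url,
                                           allowed_hosts={request.get_host()}):
            return url
        return "/"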

Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
2020-01-16 11:41:12 -08:00
Tim Abbott
98eef54e4f i18n: Update translation data from Transifex. 2020-01-16 11:41:12 -08:00
Tim Abbott
235ba339d0 filter: Allow marking is:mentioned messages as read.
We may revisit this in the future, but similar to is:private, the
current Zulip user experience makes users expect that in the
is:mentioned view, they should really be able to mark messages as
read.

Further, the practical use case for not marking them as read is very
low, since it's rare for someone to have so many mentions that
revisiting the mentions view isn't sufficient to see everything that
needs their attention.
2020-01-16 11:15:46 -08:00
Tim Abbott
e5320cc1f6 filter: Add streams:public to sorted_term_types.
This is for consistency with in:, has:, and similar values where
there's a fixed set of RHS entries.
2020-01-16 11:15:04 -08:00
Rohitt Vashishtha
1d72ea2fd5 filter: Remove is_exactly().
Previously, is_exactly() had already been replaced with can_bucket_by().
This commit removes is_exactly() and replaces its usage in our tests
with can_bucket_by().
2020-01-16 11:14:44 -08:00
Steve Howell
c7948a7960 filter: Remove redundant is:private operators.
If we have a pm-with, then is:private is redundant
and just forces us to write confusing/verbose code
in various places.
2020-01-16 11:14:18 -08:00
Rohitt Vashishtha
04bb26be3a unreads: Remove is_reading_mode().
This was a part of an experiment we ran on chat.zulip.org in Jul 2018
and surrounding code that used it never got merged to master.

See: https://chat.zulip.org/#narrow/stream/2-general/topic/un-narrow.20view/near/609506
and c407ba5175.
2020-01-16 11:13:34 -08:00
Rohitt Vashishtha
7f45ca9b22 filter: Add 'in:*' to sorted_term_types.
This simplifies our handling of in-home and in-all cases in
can_mark_messages_read().
2020-01-16 11:13:16 -08:00
Steve Howell
1bedb965e9 refactor: Clean up can_mark_messages_read.
We now explicitly enumerate various cases, which
should make it easier to change this code.
2020-01-16 11:13:11 -08:00
Anders Kaseorg
bc752188e7 create-db.sql: Start by dropping the zulip database if needed.
At some point the PostgreSQL Docker image started creating the zulip
database for us, which caused our CREATE DATABASE to fail.

Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
2020-01-15 18:12:35 -08:00
Anders Kaseorg
b0ea81fe16 create-db.sql: Handle exception if zulip user already exists.
Fixes #13530.

Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
2020-01-15 18:12:31 -08:00
Anders Kaseorg
358ab821c4 generate_secrets: Enable redis authentication in production.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
2020-01-15 14:03:15 -08:00
Anders Kaseorg
97322dd195 generate_secrets: Enable memcached authentication in production.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
2020-01-15 14:03:15 -08:00
Anders Kaseorg
1ba48a04da settings: Support optional memcached authentication.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
2020-01-15 14:03:15 -08:00
Tim Abbott
e8377b605f migrations: Fix zulipinternal migration corner case.
It's theoretically possible to have configured a Zulip server where
the system bots live in the same realm as normal users (and may have
in fact been the default in early Zulip releases?  Unclear.).  We
should handle these without the migration intended to clean up naming
for the system bot realm crashing.

Fixes #13660.
2020-01-15 14:02:07 -08:00
Tim Abbott
830f1e9f3f populate_db: Fix cache flushing when rebuilding test database.
This fixes a similar problem to the last commit; we don't use
memcached with the test database, so we don't need to flush memcached
when rebuilding it.

(And if we try, we'll get exceptions trying to access the relevant
settings).
2020-01-13 18:23:48 -08:00
Tim Abbott
037b87b580 populate_db: Fix handling of memcached flushing.
Our recent fixes to using the system's configured memcached settings
broke populate_db, because its hacky clear_database helper is called
with a hacked-up settings module.

We fix this by first moving this out-of-place code from models.py into
populate_db, and then saving the settings required to access memcached
so that we can use them in clear_database.

We also fix a mypy error in flush-memcached that matches the same
issue fixed in clear_database.
2020-01-13 18:23:44 -08:00
Anders Kaseorg
82a6e77301 flush-memcached: Use pylibmc.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
2020-01-13 17:39:25 -08:00
Anders Kaseorg
9efb90510c clear_database: Respect MEMCACHED_LOCATION.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
2020-01-13 17:39:22 -08:00
Anders Kaseorg
b255c8b8a6 puppet: Fix zuli-redis.conf path typo.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
2020-01-13 17:37:28 -08:00
Anders Kaseorg
03e8e8be9d puppet: Delete legacy redis cleanup code.
It was added in commit 9afb1c7a71 from
before 1.4.0.

Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
2020-01-13 17:37:25 -08:00
Ray Kraesig
2932d9cd28 docs: link to more-currently-maintained fork of GitX
The well-known rowanj/gitx repository hasn't been updated since 2014.
Preferentially direct new contributors to gitx/gitx instead.

(We retain the rowanj repo as a fallback, since it has precompiled
releases available.)
2020-01-13 17:36:47 -08:00
Tim Abbott
0baa205ad3 find_team: Send find team emails from the support address.
This is for consistency with the email's body, which claims replying
directly will work.
2020-01-13 17:36:37 -08:00
Tim Abbott
a8d8500c46 design: Fix missing rendered_markdown class on /me content.
There may be a deeper issue that various JavaScript logic expects
every message to have a `.message_content` element, but we definitely
should have the `.rendered_markdown` class on any markdown content.

Fixes #13634.
2020-01-13 17:36:20 -08:00
Mateusz Mandera
aa19f43f0b email_mirror: Move send_to_mm_address code to process_missed_message.
process_missed_message did nothing other than calling
send_to_missed_message_address with the same arguments, so there's no
reason to have these as separate functions.
2020-01-13 17:35:41 -08:00
Mateusz Mandera
0974b0130d email_mirror: Migrate missed message addresses from redis to database.
Addresses point 1 of #13533.

MissedMessageEmailAddress objects get tied to the specific message that
was missed by the user. A useful benefit of that is that an email
sent to that address will handle topic changes - if the message that was
missed gets its topic changed, the email response will get posted under
the new topic, while in the old model it would get posted under the
old topic, which could potentially be confusing.

Migrating redis data to this new model is a bit tricky, so the migration
code has comments explaining some of the compromises made there, and
test_migrations.py tests handling of the various possible cases that
could arise.
2020-01-13 17:35:37 -08:00
Mateusz Mandera
8a1d2bb5b6 models: Add MissedMessageEmailAddress class.
Preparatory commit for making the email mirror use the database instead
of redis for missed message addresses.

This model will represent missed message email addresses, which
currently have their data stored in redis.
The redis data will be converted and migrated into these models and
the email mirror will start using them in the main commit.
2020-01-13 17:35:34 -08:00
Tim Abbott
a38976f25d slack import: Clarify confusion around xoxe- tokens. 2020-01-13 17:35:25 -08:00
Anders Kaseorg
fccfc02981 install: Run generate_secrets.py before zulip-puppet-apply.
Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
2020-01-13 17:35:16 -08:00
Mateusz Mandera
929847ae2d test_helpers: Set Recipient class attrs in use_db_models.
Model classes fetched through apps.get_model don't get methods or class
attributes. It's not feasible to add them to all these objects in
use_db_models, but Recipient.PERSONAL etc. are worth setting, since
doing that increases the range of functions that can successfully be
imported and called in test_migrations.py.
2020-01-13 17:34:40 -08:00
Mateusz Mandera
a3338f3735 test_email_notifs: Clean up mocking.
These tests had a lot of very repetitive, identical mocking, in some
tests without even doing anything with the mocks. It's cleaner to put
the mock in the one relevant, common place for all the tests that need
it, and remove it from tests that had no use for the mocking.
2020-01-13 17:34:04 -08:00
Mateusz Mandera
f377ef6dd7 api: Return a JsonableError if API key of invalid format is given. 2020-01-13 17:34:01 -08:00
Mateusz Mandera
4c9997a523 utils: Add a function to check if string can be an API key. 2020-01-13 17:33:48 -08:00
Mateusz Mandera
2470fba95c cache: Validate keys before passing them to memcached.
Fixes #13504.

This commit is purely an improvement in error handling.

We used to not do any validation on keys before passing them to
memcached, which meant for invalid keys, memcached's own key
validation would throw an exception.  Unfortunately, the resulting
error messages are super hard to read; the traceback structure doesn't
even show where the call into memcached happened.

In this commit we add validation to all the basic cache_* functions, and
appropriate handling in their callers.
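
A minimal sketch of the kind of validation involved (memcached's
documented key rules; the exception name is illustrative):

    import re

    class InvalidCacheKeyException(Exception):
        pass

    def validate_cache_key(key):
        # memcached keys must be at most 250 bytes and must not contain
        # whitespace or control characters.
        if len(key) > 250 or re.search(r"[\x00-\x20\x7f]", key):
            raise InvalidCacheKeyException("Invalid cache key: %r" % (key,))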

We also add a lot of tests for the new behavior, which has the nice
effect of giving us decent coverage of all these core caching
functions which previously had been primarily tested manually.
2020-01-13 17:33:41 -08:00
Mateusz Mandera
2a6145f7fb default_settings: Fix inaccurate "below" phrase in comments.
These are leftovers from where we had default settings in the
settings.py file. Now that the files are separate those references to
"below" are not correct.
2020-01-03 16:54:23 -08:00
Mateusz Mandera
7036fea97b docs: Fix missing apostrophe in EMAIL_HOST_USER value. 2020-01-03 16:54:21 -08:00
Mateusz Mandera
05a42fb8df docs: Fix incorrect path to get-django-setting script. 2020-01-03 16:54:17 -08:00
Mateusz Mandera
cd0b14ce2f docs: Add some troubleshooting notes for ldap. 2020-01-03 16:54:15 -08:00
Mateusz Mandera
a1fc8fb079 ldap: Protect against troublesome deactivations in ldap sync.
If ldap sync is run while ldap is misconfigured, it can end up causing
troublesome deactivations due to not finding users in ldap -
deactivating all users, or deactivating all administrators of a realm,
which then will require manual intervention to reactivate at least one
admin in a django shell.
This change prevents such potentially troublesome situations, which are
overwhelmingly likely to be unintentional. If intentional, the --force
option can be used to remove the protection.
2020-01-03 16:54:11 -08:00
Mateusz Mandera
e147ee2087 docs: Include suggested USERNAME_ATTR in example AD ldap configs. 2020-01-03 16:54:08 -08:00
Mateusz Mandera
61180020c1 ldap: Improve logging.
Our ldap integration is quite sensitive to misconfigurations, so more
logging is better than less to help debug those issues.
Despite the following docstring on ZulipLDAPException:

"Since this inherits from _LDAPUser.AuthenticationFailed, these will
be caught and logged at debug level inside django-auth-ldap's
authenticate()"

We weren't actually logging anything, because debug level messages were
ignored due to our general logging settings. It is however desirable to
log these errors, as they can prove useful in debugging configuration
problems. The django_auth_ldap logger can get fairly spammy on debug
level, so we delegate ldap logging to a separate file
/var/log/zulip/ldap.log to avoid spamming server.log too much.
2019-12-30 10:24:01 -08:00
Mateusz Mandera
2a473c57f4 ldap: Use a cleaner super().authenticate() call in ZulipLDAPAuthBackend. 2019-12-30 10:23:58 -08:00
Anders Kaseorg
c0980e3e9e templates: Correct sample Google authorized redirect URI.
The required URI was changed in #11450.

Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
2019-12-30 10:23:43 -08:00
Anders Kaseorg
035d4c57be test-install: Use lxc-destroy -f instead of lxc-stop.
Fixes this error after rebooting the host:

$ sudo ./destroy-all  -f
zulip-install-bionic-41MM2
lxc-stop: zulip-install-bionic-41MM2: tools/lxc_stop.c: main: 191 zulip-install-bionic-41MM2 is not running

Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
2019-12-30 10:23:39 -08:00
Anders Kaseorg
fcbd24e72c test-install: Run lxc-attach with --clear-env.
The host environment variables (especially PATH) should not be allowed
to pollute the test and could interfere with it.

This allows test-install to run on a NixOS host.

Signed-off-by: Anders Kaseorg <anders@zulipchat.com>
2019-12-30 10:23:35 -08:00
Tim Abbott
29babba85a ldap: Fix bad interaction between EMAIL_ADDRESS_VISIBILITY and LDAP sync.
A block of LDAP integration code related to data synchronization did
not correctly handle EMAIL_ADDRESS_VISIBILITY_ADMINS, as it was
accessing .email, not .delivery_email, both for logging and doing the
mapping between email addresses and LDAP users.

Fixes #13539.
2019-12-30 10:23:18 -08:00
Tim Abbott
49ff894d6a Release Zulip Server 2.1.1. 2019-12-13 16:36:21 -08:00
Tim Abbott
f3e75b6b5f docs: Rewrite LDAP discussion of AUTH_LDAP_REVERSE_EMAIL_SEARCH.
This moves the mandatory configuration for options A/B/C into a single
bulleted list for each option, rather than split across two steps; I
think the result is significantly more readable.

It also fixes a bug where we suggested setting
AUTH_LDAP_REVERSE_EMAIL_SEARCH = AUTH_LDAP_USER_SEARCH in some cases,
whereas in fact it will never work because the parameters are
`%(email)s`, not `%(user)s`.

Also, now that one needs to set AUTH_LDAP_REVERSE_EMAIL_SEARCH, it
seems worth adding values for that to the Active Directory
instructions.  Thanks to @alfonsrv for the suggestion.
2019-12-13 16:32:56 -08:00
Vishnu KS
6b9f37dc8f install: Use crudini for storing value of POSTGRES_MISSING_DICTIONARIES.
This simplifies the RDS installation process to avoid awkwardly
requiring running the installer twice, and also is significantly more
robust in handling issues around rerunning the installer.

Finally, the answer for whether dictionaries are missing is available
to Django for future use in warnings/etc. around full-text search not
being great with this configuration, should they be required.
2019-12-13 16:32:48 -08:00
Mateusz Mandera
cd926b8aae migrations: Avoid triggering backend initialization in migration 0209.
Fixes #13528.
The email_auth_enabled check caused all enabled backends to get
initialized, and thus if LDAP was enabled the check_ldap_config()
check would cause an error if LDAP was misconfigured
(for example missing the new settings).
2019-12-13 10:57:38 -08:00
2642 changed files with 143554 additions and 218020 deletions

.circleci/config.yml

@@ -9,24 +9,19 @@ aliases:
run:
name: create cache directories
command: |
dirs=(/srv/zulip-{npm,venv,emoji}-cache)
sudo mkdir -p "${dirs[@]}"
sudo chown -R circleci "${dirs[@]}"
dirs=(/srv/zulip-{npm,venv}-cache)
sudo mkdir -p "${dirs[@]}"
sudo chown -R circleci "${dirs[@]}"
- &restore_cache_package_json
restore_cache:
keys:
- v1-npm-base.{{ .Environment.CIRCLE_JOB }}-{{ checksum "package.json" }}-{{ checksum "yarn.lock" }}
- v1-npm-base.{{ .Environment.CIRCLE_JOB }}-{{ checksum "package.json" }}-{{ checksum "yarn.lock" }}
- &restore_cache_requirements
restore_cache:
keys:
- v1-venv-base.{{ .Environment.CIRCLE_JOB }}-{{ checksum "requirements/thumbor-dev.txt" }}-{{ checksum "requirements/dev.txt" }}
- &restore_emoji_cache
restore_cache:
keys:
- v1-venv-base.{{ .Environment.CIRCLE_JOB }}-{{ checksum "tools/setup/emoji/emoji_map.json" }}-{{ checksum "tools/setup/emoji/build_emoji" }}-{{checksum "tools/setup/emoji/emoji_setup_utils.py" }}-{{ checksum "tools/setup/emoji/emoji_names.py" }}-{{ checksum "package.json" }}
- v1-venv-base.{{ .Environment.CIRCLE_JOB }}-{{ checksum "requirements/thumbor.txt" }}-{{ checksum "requirements/dev.txt" }}
- &install_dependencies
run:
@@ -45,11 +40,11 @@ aliases:
rm -f /home/circleci/.gitconfig
# This is the main setup job for the test suite
mispipe "tools/ci/setup-backend --skip-dev-db-build" ts
mispipe "tools/ci/setup-backend" ts
# Cleaning caches is mostly unnecessary in Circle, because
# most builds don't get to write to the cache.
# mispipe "scripts/lib/clean-unused-caches --verbose --threshold 0 2>&1" ts
# mispipe "scripts/lib/clean-unused-caches --verbose --threshold 0" ts
- &save_cache_package_json
save_cache:
@@ -61,128 +56,37 @@ aliases:
save_cache:
paths:
- /srv/zulip-venv-cache
key: v1-venv-base.{{ .Environment.CIRCLE_JOB }}-{{ checksum "requirements/thumbor-dev.txt" }}-{{ checksum "requirements/dev.txt" }}
- &save_emoji_cache
save_cache:
paths:
- /srv/zulip-emoji-cache
key: v1-venv-base.{{ .Environment.CIRCLE_JOB }}-{{ checksum "tools/setup/emoji/emoji_map.json" }}-{{ checksum "tools/setup/emoji/build_emoji" }}-{{checksum "tools/setup/emoji/emoji_setup_utils.py" }}-{{ checksum "tools/setup/emoji/emoji_names.py" }}-{{ checksum "package.json" }}
- &do_bionic_hack
run:
name: do Bionic hack
command: |
# Temporary hack till `sudo service redis-server start` gets fixes in Bionic. See
# https://chat.zulip.org/#narrow/stream/3-backend/topic/Ubuntu.20bionic.20CircleCI
sudo sed -i '/^bind/s/bind.*/bind 0.0.0.0/' /etc/redis/redis.conf
key: v1-venv-base.{{ .Environment.CIRCLE_JOB }}-{{ checksum "requirements/thumbor.txt" }}-{{ checksum "requirements/dev.txt" }}
# TODO: in Travis we also cache ~/zulip-emoji-cache, ~/node, ~/misc
- &run_backend_tests
run:
name: run backend tests
command: |
. /srv/zulip-py3-venv/bin/activate
mispipe "./tools/ci/backend 2>&1" ts
mispipe ./tools/ci/backend ts
- &run_frontend_tests
run:
name: run frontend tests
command: |
. /srv/zulip-py3-venv/bin/activate
mispipe "./tools/ci/frontend 2>&1" ts
mispipe ./tools/ci/frontend ts
- &upload_coverage_report
run:
name: upload coverage report
command: |
# codecov requires `.coverage` file to be stored in pwd for
# uploading coverage results.
mv /home/circleci/zulip/var/.coverage /home/circleci/zulip/.coverage
. /srv/zulip-py3-venv/bin/activate
# TODO: Check that the next release of codecov doesn't
# throw find error.
# codecov==2.0.16 introduced a bug which uses "find"
# for locating files which is buggy on some platforms.
# It was fixed via https://github.com/codecov/codecov-python/pull/217
# and should get automatically fixed here once it's released.
# We cannot pin the version here because we need the latest version for uploading files.
# see https://community.codecov.io/t/http-400-while-uploading-to-s3-with-python-codecov-from-travis/1428/7
pip install codecov && codecov \
|| echo "Error in uploading coverage reports to codecov.io."
- &build_production
run:
name: build production
command: |
sudo apt-get update
# Install moreutils so we can use `ts` and `mispipe` in the following.
sudo apt-get install -y moreutils
mispipe "./tools/ci/production-build 2>&1" ts
- &production_extract_tarball
run:
name: production extract tarball
command: |
sudo apt-get update
# Install moreutils so we can use `ts` and `mispipe` in the following.
sudo apt-get install -y moreutils
mispipe "/tmp/production-extract-tarball 2>&1" ts
- &install_production
run:
name: install production
command: |
sudo service rabbitmq-server restart
sudo mispipe "/tmp/production-install 2>&1" ts
- &verify_production
run:
name: verify install
command: |
sudo mispipe "/tmp/production-verify 2>&1" ts
- &upgrade_postgresql
run:
name: upgrade postgresql
command: |
sudo mispipe "/tmp/production-upgrade-pg 2>&1" ts
- &check_xenial_provision_error
run:
name: check tools/provision error message on xenial
command: |
! tools/provision > >(tee provision.out)
grep -Fqx 'CRITICAL:root:Unsupported platform: ubuntu 16.04' provision.out
- &check_xenial_upgrade_error
run:
name: check scripts/lib/upgrade-zulip-stage-2 error message on xenial
command: |
! sudo scripts/lib/upgrade-zulip-stage-2 2> >(tee upgrade.err >&2)
grep -Fq 'upgrade-zulip-stage-2: Unsupported platform: ubuntu 16.04' upgrade.err
- &notify_failure_status
run:
name: On fail
when: on_fail
branches:
only: master
command: |
if [[ "$CIRCLE_REPOSITORY_URL" == "git@github.com:zulip/zulip.git" && "$ZULIP_BOT_KEY" != "" ]]; then
curl -H "Content-Type: application/json" \
-X POST -i 'https://chat.zulip.org/api/v1/external/circleci?api_key='"$ZULIP_BOT_KEY"'&stream=automated%20testing&topic=master%20failing' \
-d '{"payload": { "branch": "'"$CIRCLE_BRANCH"'", "reponame": "'"$CIRCLE_PROJECT_REPONAME"'", "status": "failed", "build_url": "'"$CIRCLE_BUILD_URL"'", "username": "'"$CIRCLE_USERNAME"'"}}'
fi
name: upload coverage report
command: |
. /srv/zulip-py3-venv/bin/activate
pip install codecov && codecov \
|| echo "Error in uploading coverage reports to codecov.io."
jobs:
"bionic-backend-frontend":
"xenial-backend-frontend-python3.5":
docker:
# This is built from tools/ci/images/bionic/Dockerfile .
# Bionic ships with Python 3.6.
- image: arpit551/circleci:bionic-python-test
# This is built from tools/circleci/images/xenial/Dockerfile .
# Xenial ships with Python 3.5.
- image: gregprice/circleci:xenial-python-4.test
working_directory: ~/zulip
@@ -190,48 +94,33 @@ jobs:
- checkout
- *create_cache_directories
- *do_bionic_hack
- *restore_cache_package_json
- *restore_cache_requirements
- *restore_emoji_cache
- *install_dependencies
- *save_cache_package_json
- *save_cache_requirements
- *save_emoji_cache
- *run_backend_tests
- run:
name: test locked requirements
command: |
. /srv/zulip-py3-venv/bin/activate
mispipe "./tools/test-locked-requirements 2>&1" ts
- *run_frontend_tests
# We only need to upload coverage reports on whichever platform
# runs the frontend tests.
# We only need to upload coverage reports on whichever platform
# runs the frontend tests.
- *upload_coverage_report
- store_artifacts:
path: ./var/casper/
destination: casper
- store_artifacts:
path: ./var/puppeteer/
destination: puppeteer
- store_artifacts:
- store_artifacts:
path: ../../../tmp/zulip-test-event-log/
destination: test-reports
- store_test_results:
path: ./var/xunit-test-results/casper/
- *notify_failure_status
path: ./var/xunit-test-results/casper/
"focal-backend":
"bionic-backend-python3.6":
docker:
# This is built from tools/ci/images/focal/Dockerfile.
# Focal ships with Python 3.8.2.
- image: arpit551/circleci:focal-python-test
# This is built from tools/circleci/images/bionic/Dockerfile .
# Bionic ships with Python 3.6.
- image: gregprice/circleci:bionic-python-1.test
working_directory: ~/zulip
@@ -239,145 +128,24 @@ jobs:
- checkout
- *create_cache_directories
- run:
name: do Bionic hack
command: |
# Temporary hack till `sudo service redis-server start` gets fixes in Bionic. See
# https://chat.zulip.org/#narrow/stream/3-backend/topic/Ubuntu.20bionic.20CircleCI
sudo sed -i '/^bind/s/bind.*/bind 0.0.0.0/' /etc/redis/redis.conf
- *restore_cache_package_json
- *restore_cache_requirements
- *restore_emoji_cache
- *install_dependencies
- *save_cache_package_json
- *save_cache_requirements
- *save_emoji_cache
- *run_backend_tests
- run:
name: Check development database build
command: mispipe "tools/ci/setup-backend" ts
- *notify_failure_status
"xenial-legacy":
docker:
- image: arpit551/circleci:xenial-python-test
working_directory: ~/zulip
steps:
- checkout
- *check_xenial_provision_error
- *check_xenial_upgrade_error
- *notify_failure_status
"bionic-production-build":
docker:
# This is built from tools/ci/images/bionic/Dockerfile .
# Bionic ships with Python 3.6.
- image: arpit551/circleci:bionic-python-test
working_directory: ~/zulip
steps:
- checkout
- *create_cache_directories
- *do_bionic_hack
- *restore_cache_package_json
- *restore_cache_requirements
- *restore_emoji_cache
- *build_production
- *save_cache_package_json
- *save_cache_requirements
- *save_emoji_cache
# Persist the built tarball to be used in downstream job
# for installation of production server.
# See https://circleci.com/docs/2.0/workflows/#using-workspaces-to-share-data-among-jobs
- persist_to_workspace:
# Must be an absolute path,
# or relative path from working_directory.
# This is a directory on the container which is
# taken to be the root directory of the workspace.
root: /tmp
# Must be relative path from root
paths:
- zulip-server-test.tar.gz
- success-http-headers.template.txt
- production-install
- production-verify
- production-upgrade-pg
- production
- production-extract-tarball
- *notify_failure_status
"bionic-production-install":
docker:
# This is built from tools/ci/images/bionic/Dockerfile .
# Bionic ships with Python 3.6.
- image: arpit551/circleci:bionic-python-test
working_directory: ~/zulip
steps:
# Contains the built tarball from bionic-production-build job
- attach_workspace:
# Must be absolute path or relative path from working_directory
at: /tmp
- *create_cache_directories
- *do_bionic_hack
- *production_extract_tarball
- *restore_cache_package_json
- *install_production
- *verify_production
- *upgrade_postgresql
- *verify_production
- *save_cache_package_json
- *notify_failure_status
"focal-production-install":
docker:
# This is built from tools/ci/images/focal/Dockerfile.
# Focal ships with Python 3.8.2.
- image: arpit551/circleci:focal-python-test
working_directory: ~/zulip
steps:
# Contains the built tarball from bionic-production-build job
- attach_workspace:
# Must be absolute path or relative path from working_directory
at: /tmp
- *create_cache_directories
- run:
name: do memcached hack
command: |
# Temporary hack till memcached upstream is updated in Focal.
# https://bugs.launchpad.net/ubuntu/+source/memcached/+bug/1878721
echo "export SASL_CONF_PATH=/etc/sasl2" | sudo tee - a /etc/default/memcached
- *production_extract_tarball
- *restore_cache_package_json
- *install_production
- *verify_production
- *save_cache_package_json
- *notify_failure_status
workflows:
version: 2
"Ubuntu 16.04 Xenial (Python 3.5, legacy)":
build:
jobs:
- "xenial-legacy"
"Ubuntu 18.04 Bionic (Python 3.6, backend+frontend)":
jobs:
- "bionic-backend-frontend"
"Ubuntu 20.04 Focal (Python 3.8, backend)":
jobs:
- "focal-backend"
"Production":
jobs:
- "bionic-production-build"
- "bionic-production-install":
requires:
- "bionic-production-build"
- "focal-production-install":
requires:
- "bionic-production-build"
- "xenial-backend-frontend-python3.5"
- "bionic-backend-python3.6"


@@ -6,7 +6,7 @@ charset = utf-8
trim_trailing_whitespace = true
insert_final_newline = true
[*.{sh,py,pyi,js,ts,json,xml,css,scss,hbs,html}]
[*.{sh,py,pyi,js,ts,json,yml,xml,css,md,markdown,handlebars,html}]
indent_style = space
indent_size = 4
@@ -16,6 +16,10 @@ max_line_length = 110
[*.{js,ts}]
max_line_length = 100
[*.{svg,rb,pp}]
[*.{svg,rb,pp,pl}]
indent_style = space
indent_size = 2
[*.cfg]
indent_style = space
indent_size = 8


@@ -3,346 +3,492 @@
"node": true,
"es6": true
},
"extends": [
"eslint:recommended",
"prettier"
],
"parserOptions": {
"ecmaVersion": 2019,
"warnOnUnsupportedTypeScriptVersion": false,
"sourceType": "module"
},
"globals": {
"$": false,
"ClipboardJS": false,
"Dict": false,
"FetchStatus": false,
"Filter": false,
"Handlebars": false,
"LightboxCanvas": false,
"MessageListData": false,
"MessageListView": false,
"Plotly": false,
"SockJS": false,
"Socket": false,
"Sortable": false,
"WinChan": false,
"XDate": false,
"_": false,
"activity": false,
"admin": false,
"alert_words": false,
"alert_words_ui": false,
"attachments_ui": false,
"avatar": false,
"billing": false,
"blueslip": false,
"bot_data": false,
"bridge": false,
"buddy_data": false,
"buddy_list": false,
"channel": false,
"click_handlers": false,
"color_data": false,
"colorspace": false,
"common": false,
"components": false,
"compose": false,
"compose_actions": false,
"compose_fade": false,
"compose_pm_pill": false,
"compose_state": false,
"compose_ui": false,
"composebox_typeahead": false,
"condense": false,
"confirm_dialog": false,
"copy_and_paste": false,
"csrf_token": false,
"current_msg_list": true,
"drafts": false,
"echo": false,
"emoji": false,
"emoji_codes": false,
"emoji_picker": false,
"favicon": false,
"feature_flags": false,
"feedback_widget": false,
"fenced_code": false,
"flatpickr": false,
"floating_recipient_bar": false,
"gear_menu": false,
"hash_util": false,
"hashchange": false,
"helpers": false,
"history": false,
"home_msg_list": false,
"hotspots": false,
"i18n": false,
"info_overlay": false,
"input_pill": false,
"invite": false,
"jQuery": false,
"katex": false,
"keydown_util": false,
"lightbox": false,
"list_cursor": false,
"list_render": false,
"list_util": false,
"loading": false,
"localStorage": false,
"local_message": false,
"localstorage": false,
"location": false,
"markdown": false,
"marked": false,
"md5": false,
"message_edit": false,
"message_events": false,
"message_fetch": false,
"message_flags": false,
"message_list": false,
"message_live_update": false,
"message_scroll": false,
"message_store": false,
"message_util": false,
"message_viewport": false,
"moment": false,
"muting": false,
"muting_ui": false,
"narrow": false,
"narrow_state": false,
"navigate": false,
"night_mode": false,
"notifications": false,
"overlays": false,
"padded_widget": false,
"page_params": false,
"panels": false,
"people": false,
"pm_conversations": false,
"pm_list": false,
"pointer": false,
"popovers": false,
"presence": false,
"pygments_data": false,
"reactions": false,
"realm_icon": false,
"realm_logo": false,
"realm_night_logo": false,
"recent_senders": false,
"reload": false,
"reload_state": false,
"reminder": false,
"resize": false,
"rows": false,
"rtl": false,
"run_test": false,
"schema": false,
"scroll_bar": false,
"scroll_util": false,
"search": false,
"search_pill": false,
"search_pill_widget": false,
"search_suggestion": false,
"search_util": false,
"sent_messages": false,
"server_events": false,
"server_events_dispatch": false,
"settings": false,
"settings_account": false,
"settings_bots": false,
"settings_display": false,
"settings_emoji": false,
"settings_exports": false,
"settings_linkifiers": false,
"settings_invites": false,
"settings_muting": false,
"settings_notifications": false,
"settings_org": false,
"settings_panel_menu": false,
"settings_profile_fields": false,
"settings_sections": false,
"settings_streams": false,
"settings_toggle": false,
"settings_ui": false,
"settings_user_groups": false,
"settings_users": false,
"starred_messages": false,
"stream_color": false,
"stream_create": false,
"stream_data": false,
"stream_edit": false,
"stream_events": false,
"stream_list": false,
"stream_muting": false,
"stream_popover": false,
"stream_sort": false,
"stream_ui_updates": false,
"StripeCheckout": false,
"submessage": false,
"subs": false,
"tab_bar": false,
"templates": false,
"tictactoe_widget": false,
"timerender": false,
"todo_widget": false,
"top_left_corner": false,
"topic_data": false,
"topic_generator": false,
"topic_list": false,
"topic_zoom": false,
"transmit": false,
"tutorial": false,
"typeahead_helper": false,
"typing": false,
"typing_data": false,
"typing_events": false,
"ui": false,
"ui_init": false,
"ui_report": false,
"ui_util": false,
"unread": false,
"unread_ops": false,
"unread_ui": false,
"upgrade": false,
"upload": false,
"upload_widget": false,
"user_events": false,
"user_groups": false,
"user_pill": false,
"user_search": false,
"user_status": false,
"user_status_ui": false,
"util": false,
"poll_widget": false,
"widgetize": false,
"zcommand": false,
"zform": false,
"zxcvbn": false
},
"plugins": [
"eslint-plugin-empty-returns"
],
"rules": {
"array-callback-return": "error",
"arrow-body-style": "error",
"array-bracket-spacing": "error",
"arrow-spacing": [ "error", { "before": true, "after": true } ],
"block-scoped-var": "error",
"brace-style": [ "error", "1tbs", { "allowSingleLine": true } ],
"camelcase": "off",
"comma-dangle": [ "error",
{
"arrays": "always-multiline",
"objects": "always-multiline",
"imports": "always-multiline",
"exports": "always-multiline",
"functions": "never"
}
],
"comma-spacing": [ "error",
{
"before": false,
"after": true
}
],
"complexity": [ "off", 4 ],
"curly": "error",
"dot-notation": "error",
"dot-notation": [ "error", { "allowKeywords": true } ],
"empty-returns/main": "error",
"eol-last": [ "error", "always" ],
"eqeqeq": "error",
"func-style": [ "off", "expression" ],
"guard-for-in": "error",
"indent": ["error", 4, {
"ArrayExpression": "first",
"ObjectExpression": "first",
"SwitchCase": 0,
"CallExpression": {"arguments": "first"},
"FunctionExpression": {"parameters": "first"},
"FunctionDeclaration": {"parameters": "first"}
}],
"key-spacing": [ "error",
{
"beforeColon": false,
"afterColon": true
}
],
"keyword-spacing": [ "error",
{
"before": true,
"after": true,
"overrides": {
"return": { "after": true },
"throw": { "after": true },
"case": { "after": true }
}
}
],
"max-depth": [ "off", 4 ],
"max-len": [ "error", 100, 2,
{
"ignoreUrls": true,
"ignoreComments": false,
"ignoreRegExpLiterals": true,
"ignoreStrings": true,
"ignoreTemplateLiterals": true
}
],
"max-params": [ "off", 3 ],
"max-statements": [ "off", 10 ],
"new-cap": [ "error",
{
"newIsCap": true,
"capIsNew": false
}
],
"new-parens": "error",
"newline-per-chained-call": "off",
"no-alert": "error",
"no-array-constructor": "error",
"no-bitwise": "error",
"no-caller": "error",
"no-case-declarations": "error",
"no-catch-shadow": "error",
"no-constant-condition": ["error", {"checkLoops": false}],
"no-console": "off",
"no-const-assign": "error",
"no-control-regex": "error",
"no-debugger": "error",
"no-delete-var": "error",
"no-div-regex": "error",
"no-dupe-class-members": "error",
"no-dupe-keys": "error",
"no-duplicate-imports": "error",
"no-else-return": "error",
"no-empty": "error",
"no-empty-character-class": "error",
"no-eq-null": "error",
"no-eval": "error",
"no-ex-assign": "error",
"no-extra-parens": ["error", "all"],
"no-extra-semi": "error",
"no-fallthrough": "error",
"no-floating-decimal": "error",
"no-func-assign": "error",
"no-implied-eval": "error",
"no-inner-declarations": "off",
"no-iterator": "error",
"no-label-var": "error",
"no-labels": "error",
"no-loop-func": "error",
"no-mixed-requires": [ "off", false ],
"no-multi-str": "error",
"no-native-reassign": "error",
"no-nested-ternary": "off",
"no-new-func": "error",
"no-new-object": "error",
"no-new-wrappers": "error",
"no-obj-calls": "error",
"no-octal": "error",
"no-octal-escape": "error",
"no-param-reassign": "off",
"no-plusplus": "error",
"no-proto": "error",
"no-redeclare": "error",
"no-regex-spaces": "error",
"no-restricted-syntax": "off",
"no-return-assign": "error",
"no-script-url": "error",
"no-self-compare": "error",
"no-shadow": "off",
"no-sync": "error",
"no-ternary": "off",
"no-trailing-spaces": "error",
"no-undef": "error",
"no-undef-init": "error",
"no-underscore-dangle": "off",
"no-unneeded-ternary": [ "error", { "defaultAssignment": false } ],
"no-unreachable": "error",
"no-unused-expressions": "error",
"no-unused-vars": [ "error",
{
"vars": "local",
"args": "after-used",
"varsIgnorePattern": "print_elapsed_time|check_duplicate_ids"
}
],
"no-use-before-define": "error",
"no-useless-constructor": "error",
// The Zulip codebase complies partially with the "no-useless-escape"
// rule; only regex expressions haven't been updated yet.
// Updated regex expressions are currently being tested in casper
// files, which will inform a potential future enforcement of this rule.
"no-useless-escape": "off",
"no-var": "error",
"space-unary-ops": "error",
"no-whitespace-before-property": "error",
"no-with": "error",
"one-var": [ "error", "never" ],
"prefer-arrow-callback": "error",
"padded-blocks": "off",
"prefer-const": [ "error",
{
"destructuring": "any",
"ignoreReadBeforeAssign": true
}
],
"quote-props": [ "error", "as-needed",
{
"keywords": false,
"unnecessary": true,
"numbers": false
}
],
"quotes": [ "off", "single" ],
"radix": "error",
"semi": "error",
"semi-spacing": ["error", {"before": false, "after": true}],
"sort-imports": "error",
"space-before-blocks": "error",
"space-before-function-paren": [ "error",
{
"anonymous": "always",
"named": "never",
"asyncArrow": "always"
}
],
"space-in-parens": "error",
"space-infix-ops": "error",
"spaced-comment": "off",
"strict": "off",
"template-curly-spacing": "error",
"unnecessary-strict": "off",
"use-isnan": "error",
"valid-typeof": [ "error", { "requireStringLiterals": true } ],
"wrap-iife": [ "error", "outside", { "functionPrototypeMethods": false } ],
"wrap-regex": "off",
"yoda": "error"
},
"overrides": [
{
"files": [
"frontend_tests/**/*.{js,ts}",
"static/js/**/*.{js,ts}"
],
"globals": {
"$": false,
"ClipboardJS": false,
"FetchStatus": false,
"Filter": false,
"Handlebars": false,
"LightboxCanvas": false,
"MessageListData": false,
"MessageListView": false,
"Plotly": false,
"Sortable": false,
"WinChan": false,
"XDate": false,
"_": false,
"activity": false,
"admin": false,
"alert_words": false,
"alert_words_ui": false,
"attachments_ui": false,
"avatar": false,
"billing": false,
"blueslip": false,
"bot_data": false,
"bridge": false,
"buddy_data": false,
"buddy_list": false,
"channel": false,
"click_handlers": false,
"color_data": false,
"colorspace": false,
"common": false,
"components": false,
"compose": false,
"compose_actions": false,
"compose_fade": false,
"compose_pm_pill": false,
"compose_state": false,
"compose_ui": false,
"composebox_typeahead": false,
"condense": false,
"confirm_dialog": false,
"copy_and_paste": false,
"csrf_token": false,
"current_msg_list": true,
"drafts": false,
"dropdown_list_widget": false,
"echo": false,
"emoji": false,
"emoji_picker": false,
"favicon": false,
"feature_flags": false,
"feedback_widget": false,
"fenced_code": false,
"flatpickr": false,
"floating_recipient_bar": false,
"gear_menu": false,
"hash_util": false,
"hashchange": false,
"helpers": false,
"history": false,
"home_msg_list": false,
"hotspots": false,
"i18n": false,
"info_overlay": false,
"input_pill": false,
"invite": false,
"jQuery": false,
"katex": false,
"keydown_util": false,
"lightbox": false,
"list_cursor": false,
"list_render": false,
"list_util": false,
"loading": false,
"localStorage": false,
"local_message": false,
"localstorage": false,
"location": false,
"markdown": false,
"marked": false,
"md5": false,
"message_edit": false,
"message_edit_history": false,
"message_events": false,
"message_fetch": false,
"message_flags": false,
"message_list": false,
"message_live_update": false,
"message_scroll": false,
"message_store": false,
"message_util": false,
"message_viewport": false,
"moment": false,
"muting": false,
"muting_ui": false,
"narrow": false,
"narrow_state": false,
"navigate": false,
"night_mode": false,
"notifications": false,
"overlays": false,
"padded_widget": false,
"page_params": false,
"panels": false,
"people": false,
"pm_conversations": false,
"pm_list": false,
"pm_list_dom": false,
"pointer": false,
"popovers": false,
"presence": false,
"reactions": false,
"realm_icon": false,
"realm_logo": false,
"realm_night_logo": false,
"recent_senders": false,
"recent_topics": false,
"reload": false,
"reload_state": false,
"reminder": false,
"resize": false,
"rows": false,
"rtl": false,
"run_test": false,
"schema": false,
"scroll_bar": false,
"scroll_util": false,
"search": false,
"search_pill": false,
"search_pill_widget": false,
"search_suggestion": false,
"search_util": false,
"sent_messages": false,
"server_events": false,
"server_events_dispatch": false,
"settings": false,
"settings_account": false,
"settings_bots": false,
"settings_display": false,
"settings_emoji": false,
"settings_exports": false,
"settings_linkifiers": false,
"settings_invites": false,
"settings_muting": false,
"settings_notifications": false,
"settings_org": false,
"settings_panel_menu": false,
"settings_profile_fields": false,
"settings_sections": false,
"settings_streams": false,
"settings_toggle": false,
"settings_ui": false,
"settings_user_groups": false,
"settings_users": false,
"spoilers": false,
"starred_messages": false,
"stream_color": false,
"stream_create": false,
"stream_data": false,
"stream_edit": false,
"stream_events": false,
"stream_topic_history": false,
"stream_list": false,
"stream_muting": false,
"stream_popover": false,
"stream_sort": false,
"stream_ui_updates": false,
"StripeCheckout": false,
"submessage": false,
"subs": false,
"tab_bar": false,
"templates": false,
"tictactoe_widget": false,
"timerender": false,
"todo_widget": false,
"top_left_corner": false,
"topic_generator": false,
"topic_list": false,
"topic_zoom": false,
"transmit": false,
"tutorial": false,
"typeahead_helper": false,
"typing": false,
"typing_data": false,
"typing_events": false,
"ui": false,
"ui_init": false,
"ui_report": false,
"ui_util": false,
"unread": false,
"unread_ops": false,
"unread_ui": false,
"upgrade": false,
"upload": false,
"upload_widget": false,
"user_events": false,
"user_groups": false,
"user_pill": false,
"user_search": false,
"user_status": false,
"user_status_ui": false,
"poll_widget": false,
"vdom": false,
"widgetize": false,
"zcommand": false,
"zform": false,
"zxcvbn": false
}
},
{
"files": [
"frontend_tests/casper_tests/*.js",
"frontend_tests/casper_lib/*.js"
],
"rules": {
// Don't require ES features that PhantomJS doesn't support
"no-var": "off",
"prefer-arrow-callback": "off"
"no-var": "off" // PhantomJS doesn't support let, const
}
},
{
"files": ["**/*.ts"],
"extends": [
"plugin:@typescript-eslint/recommended",
"prettier/@typescript-eslint"
],
"parser": "@typescript-eslint/parser",
"parserOptions": {
"project": "tsconfig.json"
},
"plugins": ["@typescript-eslint"],
"rules": {
// Disable base rule to avoid conflict
"empty-returns/main": "off",
"indent": "off",
"func-call-spacing": "off",
"no-magic-numbers": "off",
"semi": "off",
"no-unused-vars": "off",
"no-useless-constructor": "off",
"@typescript-eslint/adjacent-overload-signatures": "error",
"@typescript-eslint/array-type": "error",
"@typescript-eslint/await-thenable": "error",
"@typescript-eslint/ban-types": "error",
"@typescript-eslint/ban-ts-ignore": "off",
"@typescript-eslint/camelcase": "off",
"@typescript-eslint/class-name-casing": "error",
"@typescript-eslint/consistent-type-assertions": "error",
"@typescript-eslint/explicit-function-return-type": ["error", { "allowExpressions": true }],
"@typescript-eslint/explicit-member-accessibility": "off",
"@typescript-eslint/func-call-spacing": "error",
"@typescript-eslint/generic-type-naming": "off",
"@typescript-eslint/indent": "error",
"@typescript-eslint/interface-name-prefix": "off",
"@typescript-eslint/member-delimiter-style": "error",
"@typescript-eslint/member-naming": ["error", { "private": "^_" } ],
"@typescript-eslint/member-ordering": "error",
"@typescript-eslint/no-array-constructor": "error",
"@typescript-eslint/no-empty-interface": "error",
"@typescript-eslint/no-explicit-any": "off",
"@typescript-eslint/no-extraneous-class": "error",
"@typescript-eslint/no-for-in-array": "off",
"@typescript-eslint/no-inferrable-types": "error",
"@typescript-eslint/no-magic-numbers": "off",
"@typescript-eslint/no-misused-new": "error",
"@typescript-eslint/no-namespace": "error",
"@typescript-eslint/no-non-null-assertion": "off",
"@typescript-eslint/no-parameter-properties": "error",
"@typescript-eslint/no-require-imports": "off",
"@typescript-eslint/no-this-alias": "off",
"@typescript-eslint/no-type-alias": "off",
"@typescript-eslint/no-unnecessary-qualifier": "error",
"@typescript-eslint/no-unnecessary-type-assertion": "error",
"@typescript-eslint/no-unused-vars": ["error", { "varsIgnorePattern": "^_" } ],
"@typescript-eslint/no-use-before-define": "error",
"@typescript-eslint/no-useless-constructor": "error",
"@typescript-eslint/no-var-requires": "off",
"@typescript-eslint/prefer-for-of": "off",
"@typescript-eslint/prefer-function-type": "off",
"@typescript-eslint/prefer-includes": "error",
"@typescript-eslint/prefer-interface": "off",
"@typescript-eslint/prefer-namespace-keyword": "error",
"@typescript-eslint/prefer-regexp-exec": "error",
"@typescript-eslint/prefer-string-starts-ends-with": "error",
"@typescript-eslint/promise-function-async": "error",
"@typescript-eslint/restrict-plus-operands": "off",
"@typescript-eslint/semi": "error",
"@typescript-eslint/triple-slash-reference": "error",
"@typescript-eslint/type-annotation-spacing": "error",
"@typescript-eslint/unbound-method": "off",
"@typescript-eslint/unified-signatures": "error"
}
}


@@ -1,30 +0,0 @@
name: "Code Scanning"
on: [push, pull_request]
jobs:
CodeQL:
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v2
with:
# We must fetch at least the immediate parents so that if this is
# a pull request then we can checkout the head.
fetch-depth: 2
# If this run was triggered by a pull request event, then checkout
# the head of the pull request instead of the merge commit.
- run: git checkout HEAD^2
if: ${{ github.event_name == 'pull_request' }}
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v1
# Override language selection by uncommenting this and choosing your languages
# with:
# languages: go, javascript, csharp, python, cpp, java
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v1


@@ -1,171 +0,0 @@
name: Zulip CI
on: [push, pull_request]
defaults:
run:
shell: bash
jobs:
focal_bionic:
strategy:
matrix:
include:
# This docker image was created by a generated Dockerfile at:
# tools/ci/images/bionic/Dockerfile
# Bionic ships with Python 3.6.
- docker_image: mepriyank/actions:bionic
name: Ubuntu 18.04 Bionic (Python 3.6, backend + frontend)
os: bionic
is_bionic: true
include_frontend_tests: true
# This docker image was created by a generated Dockerfile at:
# tools/ci/images/focal/Dockerfile
# Focal ships with Python 3.8.2.
- docker_image: mepriyank/actions:focal
name: Ubuntu 20.04 Focal (Python 3.8, backend)
os: focal
is_focal: true
include_frontend_tests: false
runs-on: ubuntu-latest
name: ${{ matrix.name }}
container: ${{ matrix.docker_image }}
env:
# GitHub Actions sets HOME to /github/home, which causes
# problems later in provision and in the frontend tests that run
# tools/setup/postgres-init-dev-db, because of the .pgpass
# location. PostgreSQL (psql) expects .pgpass to be at
# /home/github/.pgpass, and setting HOME to `/home/github/`
# ensures it is written there, because we write it to ~/.pgpass.
HOME: /home/github/
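A minimal sketch (illustrative, not Zulip code) of why this HOME override works: tilde expansion follows $HOME, so the process that writes .pgpass and the psql process that reads it must agree on the same value.

import os
from pathlib import Path

# Matches the env override above; psql resolves ~/.pgpass the same way.
os.environ["HOME"] = "/home/github/"
pgpass = Path("~/.pgpass").expanduser()
print(pgpass)  # /home/github/.pgpass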
steps:
- name: Add required permissions
run: |
# The checkout action doesn't clone to ~/zulip or allow
# us to use the path option to clone outside the current
# /__w/zulip/zulip directory. Since this directory is owned
# by root, we need to change its ownership to allow the
# github user to clone the code here.
# Note: /__w/ is a docker volume mounted to $GITHUB_WORKSPACE
# which is /home/runner/work/.
sudo chown -R github .
# This is the GitHub Actions-specific cache directory that
# the current github user must be able to access for the
# cache action to work. It is currently owned by root.
sudo chmod -R 0777 /__w/_temp/
- uses: actions/checkout@v2
- name: Create cache directories
run: |
dirs=(/srv/zulip-{npm,venv,emoji}-cache)
sudo mkdir -p "${dirs[@]}"
sudo chown -R github "${dirs[@]}"
- name: Restore node_modules cache
uses: actions/cache@v2
with:
path: /srv/zulip-npm-cache
key: v1-yarn-deps-${{ matrix.os }}-${{ hashFiles('package.json') }}-${{ hashFiles('yarn.lock') }}
restore-keys: v1-yarn-deps-${{ matrix.os }}
- name: Restore python cache
uses: actions/cache@v2
with:
path: /srv/zulip-venv-cache
key: v1-venv-${{ matrix.os }}-${{ hashFiles('requirements/thumbor-dev.txt') }}-${{ hashFiles('requirements/dev.txt') }}
restore-keys: v1-venv-${{ matrix.os }}
- name: Restore emoji cache
uses: actions/cache@v2
with:
path: /srv/zulip-emoji-cache
key: v1-emoji-${{ matrix.os }}-${{ hashFiles('tools/setup/emoji/emoji_map.json') }}-${{ hashFiles('tools/setup/emoji/build_emoji') }}-${{ hashFiles('tools/setup/emoji/emoji_setup_utils.py') }}-${{ hashFiles('tools/setup/emoji/emoji_names.py') }}-${{ hashFiles('package.json') }}
restore-keys: v1-emoji-${{ matrix.os }}
- name: Do Bionic hack
if: ${{ matrix.is_bionic }}
run: |
# Temporary hack till `sudo service redis-server start` gets fixed in Bionic. See
# https://chat.zulip.org/#narrow/stream/3-backend/topic/Ubuntu.20bionic.20CircleCI
sudo sed -i '/^bind/s/bind.*/bind 0.0.0.0/' /etc/redis/redis.conf
- name: Install dependencies
run: |
# This is the main setup job for the test suite
mispipe "tools/ci/setup-backend --skip-dev-db-build" ts
# Cleaning caches is mostly unnecessary in GitHub Actions, because
# most builds don't get to write to the cache.
# mispipe "scripts/lib/clean-unused-caches --verbose --threshold 0 2>&1" ts
- name: Run backend tests
run: |
. /srv/zulip-py3-venv/bin/activate && \
mispipe "./tools/ci/backend 2>&1" ts
- name: Run frontend tests
if: ${{ matrix.include_frontend_tests }}
run: |
. /srv/zulip-py3-venv/bin/activate
mispipe "./tools/ci/frontend 2>&1" ts
- name: Test locked requirements
if: ${{ matrix.is_bionic }}
run: |
. /srv/zulip-py3-venv/bin/activate && \
mispipe "./tools/test-locked-requirements 2>&1" ts
- name: Upload coverage reports
# Only upload coverage when both frontend and backend
# tests are run.
if: ${{ matrix.include_frontend_tests }}
run: |
# Codecov requires the `.coverage` file to be stored in the
# current working directory.
mv ./var/.coverage ./.coverage
. /srv/zulip-py3-venv/bin/activate || true
# TODO: Check that the next release of codecov doesn't
# throw this find error.
# codecov==2.0.16 introduced a bug: it uses "find" for
# locating files, which is buggy on some platforms.
# It was fixed via https://github.com/codecov/codecov-python/pull/217
# and should get automatically fixed here once it's released.
# We cannot pin the version here because we need the latest version for uploading files.
# see https://community.codecov.io/t/http-400-while-uploading-to-s3-with-python-codecov-from-travis/1428/7
pip install codecov && codecov || echo "Error in uploading coverage reports to codecov.io."
- name: Store puppeteer artifacts
if: ${{ matrix.include_frontend_tests }}
uses: actions/upload-artifact@v2
with:
name: puppeteer
path: ./var/puppeteer
# We cannot use the upload-artifact action to upload the test
# reports from /tmp, since that directory exists only inside the
# docker image. Move them to ./var so we can access them outside
# docker, since the current directory is volume-mounted outside
# the docker image.
- name: Move test reports to var
run: mv /tmp/zulip-test-event-log/ ./var/
- name: Store test reports
if: ${{ matrix.is_bionic }}
uses: actions/upload-artifact@v2
with:
name: test-reports
path: ./var/zulip-test-event-log/
- name: Check development database build
if: ${{ matrix.is_focal }}
run: mispipe "tools/ci/setup-backend" ts
# TODO: We need to port the notify_failure step from the CircleCI
# config; however, it might be the case that GitHub notifications
# make this unnecessary. More details on settings to configure it:
# https://help.github.com/en/github/managing-subscriptions-and-notifications-on-github/configuring-notifications#github-actions-notification-options
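One note on the actions/cache steps in this workflow: each key is a pure content hash, so editing any hashed file produces a new key (and a cold cache), while restore-keys falls back to the newest cache sharing the prefix. A rough Python analogue of the key construction (the exact hashFiles() algorithm is GitHub's; this is only a sketch):

import hashlib

def hash_files(*paths: str) -> str:
    # Concatenate file contents into one digest; any edit changes the key.
    digest = hashlib.sha256()
    for path in paths:
        with open(path, "rb") as f:
            digest.update(f.read())
    return digest.hexdigest()

os_name = "bionic"
key = "v1-venv-%s-%s" % (
    os_name, hash_files("requirements/thumbor-dev.txt", "requirements/dev.txt"))
restore_key = "v1-venv-%s" % (os_name,)  # prefix fallback; may restore a stale cache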

.gitignore

@@ -32,7 +32,7 @@ package-lock.json
/.dmypy.json
# Dockerfiles generated for CircleCI
/tools/ci/images
/tools/circleci/images
# Generated i18n data
/locale/en
@@ -76,9 +76,6 @@ zulip.kdev4
.cache/
.eslintcache
# Core dump files
core
## Miscellaneous
# (Ideally this section is empty.)
zthumbor/thumbor_local_settings.py


@@ -1,7 +1,10 @@
[settings]
src_paths = ., tools, tools/setup/emoji
multi_line_output = 3
known_third_party = zulip
include_trailing_comma = True
use_parentheses = True
line_length = 100
line_length = 79
multi_line_output = 2
balanced_wrapping = true
known_third_party = django, ujson, sqlalchemy
known_first_party = zerver, zproject, version, confirmation, zilencer, analytics, frontend_tests, scripts, corporate
sections = FUTURE, STDLIB, THIRDPARTY, FIRSTPARTY, LOCALFOLDER
lines_after_imports = 1
# See the comment related to ioloop_logging for why this is skipped.
skip = zerver/management/commands/runtornado.py
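For reference, the two multi_line_output values above wrap long imports differently; a small illustration using stdlib modules:

# multi_line_output = 3: vertical hanging indent, parenthesized, trailing comma
from collections import (
    OrderedDict,
    defaultdict,
)

# multi_line_output = 2: hanging indent with backslash continuations
from collections import OrderedDict, \
    defaultdict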


@@ -1,29 +0,0 @@
Alex Vandiver <alexmv@zulip.com> <alex@chmrr.net>
Alex Vandiver <alexmv@zulip.com> <github@chmrr.net>
Aman Agrawal <amanagr@zulip.com> <f2016561@pilani.bits-pilani.ac.in>
Anders Kaseorg <anders@zulip.com> <anders@zulipchat.com>
Anders Kaseorg <anders@zulip.com> <andersk@mit.edu>
Brock Whittaker <brock@zulipchat.com> <bjwhitta@asu.edu>
Brock Whittaker <brock@zulipchat.com> <brockwhittaker@Brocks-MacBook.local>
Brock Whittaker <brock@zulipchat.com> <brock@zulipchat.org>
Chris Bobbe <cbobbe@zulip.com> <cbobbe@zulipchat.com>
Chris Bobbe <cbobbe@zulip.com> <csbobbe@gmail.com>
Greg Price <greg@zulip.com> <gnprice@gmail.com>
Greg Price <greg@zulip.com> <greg@zulipchat.com>
Greg Price <greg@zulip.com> <price@mit.edu>
Ray Kraesig <rkraesig@zulip.com> <rkraesig@zulipchat.com>
Rishi Gupta <rishig@zulip.com> <rishig+git@mit.edu>
Rishi Gupta <rishig@zulip.com> <rishig@kandralabs.com>
Rishi Gupta <rishig@zulip.com> <rishig@users.noreply.github.com>
Rishi Gupta <rishig@zulip.com> <rishig@zulipchat.com>
Steve Howell <showell@zulip.com> <showell30@yahoo.com>
Steve Howell <showell@zulip.com> <showell@yahoo.com>
Steve Howell <showell@zulip.com> <showell@zulipchat.com>
Steve Howell <showell@zulip.com> <steve@humbughq.com>
Steve Howell <showell@zulip.com> <steve@zulip.com>
Tim Abbott <tabbott@zulip.com> <tabbott@dropbox.com>
Tim Abbott <tabbott@zulip.com> <tabbott@humbughq.com>
Tim Abbott <tabbott@zulip.com> <tabbott@mit.edu>
Tim Abbott <tabbott@zulip.com> <tabbott@zulipchat.com>
Vishnu KS <yo@vishnuks.com> <hackerkid@vishnuks.com>
Vishnu KS <yo@vishnuks.com> <yo@vishnuks.com>


@@ -1 +0,0 @@
/static/third


@@ -1,14 +0,0 @@
{
"source_directories": ["."],
"taint_models_path": [
"stubs/taint",
"zulip-py3-venv/lib/pyre_check/taint/"
],
"search_path": [
"stubs/",
"zulip-py3-venv/lib/pyre_check/stubs/"
],
"exclude": [
"/srv/zulip/zulip-py3-venv/.*"
]
}


@@ -1 +0,0 @@
sonar.inclusions=**/*.py,**/*.html

.travis.yml

@@ -0,0 +1,66 @@
# See https://zulip.readthedocs.io/en/latest/testing/continuous-integration.html for
# high-level documentation on our Travis CI setup.
dist: xenial
install:
# Disable sometimes-broken sources.list in Travis base images
- sudo rm -vf /etc/apt/sources.list.d/*
- sudo apt-get update
# Disable Travis CI's built-in NVM installation
- mispipe "mv ~/.nvm ~/.travis-nvm-disabled" ts
# Install codecov, the library for the code coverage reporting tool we use
# With a retry to minimize impact of transient networking errors.
- mispipe "pip install codecov" ts || mispipe "pip install codecov" ts
# This is the main setup job for the test suite
- mispipe "tools/ci/setup-$TEST_SUITE" ts
# Clean any caches that are not in use to avoid our cache
# becoming huge.
- mispipe "scripts/lib/clean-unused-caches --verbose --threshold 0" ts
script:
# We unset GEM_PATH here as a hack to work around Travis CI having
# broken the ability to run their system puppet with Ruby. See
# https://travis-ci.org/zulip/zulip/jobs/240120991 for an example traceback.
- unset GEM_PATH
- mispipe "./tools/ci/$TEST_SUITE" ts
cache:
yarn: true
apt: false
directories:
- $HOME/zulip-venv-cache
- $HOME/zulip-npm-cache
- $HOME/zulip-emoji-cache
- $HOME/node
- $HOME/misc
env:
global:
- BOTO_CONFIG=/nonexistent
language: python
# Our test suites generally run on Python 3.5, the version in
# Ubuntu 16.04 xenial, which is the oldest OS release we support.
matrix:
include:
# Travis will actually run the jobs in the order they're listed here;
# that doesn't seem to be documented, but it's what we see empirically.
# We only get 4 jobs running at a time, so we try to make the first few
# the most likely to break.
- python: "3.5"
env: TEST_SUITE=production
# Other suites moved to CircleCI -- see .circleci/.
sudo: required
addons:
artifacts:
paths:
# Casper debugging data (screenshots, etc.) is super useful for
# debugging test flakes.
- $(ls var/casper/* | tr "\n" ":")
- $(ls /tmp/zulip-test-event-log/* | tr "\n" ":")
postgresql: "9.5"
apt:
packages:
- moreutils
after_success:
- codecov


@@ -101,5 +101,5 @@ This Code of Conduct is adapted from the
[Citizen Code of Conduct](http://citizencodeofconduct.org/) and the
[Django Code of Conduct](https://www.djangoproject.com/conduct/), and is
under a
[Creative Commons BY-SA](https://creativecommons.org/licenses/by-sa/4.0/)
[Creative Commons BY-SA](http://creativecommons.org/licenses/by-sa/4.0/)
license.


@@ -32,8 +32,8 @@ needs doing:
[desktop app](https://github.com/zulip/zulip-desktop).
* Building out our
[Python API and bots](https://github.com/zulip/python-zulip-api) framework.
* [Writing an integration](https://zulip.com/api/integrations-overview).
* Improving our [user](https://zulip.com/help/) or
* [Writing an integration](https://zulipchat.com/api/integrations-overview).
* Improving our [user](https://zulipchat.com/help/) or
[developer](https://zulip.readthedocs.io/en/latest/) documentation.
* [Reviewing code](https://zulip.readthedocs.io/en/latest/contributing/code-reviewing.html)
and manually testing pull requests.
@@ -47,7 +47,7 @@ don't require touching the codebase at all. We list a few of them below:
* [Translating](https://zulip.readthedocs.io/en/latest/translating/translating.html)
Zulip.
* [Outreach](#zulip-outreach): Star us on GitHub, upvote us
on product comparison sites, or write for [the Zulip blog](https://blog.zulip.org/).
on product comparison sites, or write for [the Zulip blog](http://blog.zulip.org/).
## Your first (codebase) contribution
@@ -75,6 +75,8 @@ to help.
[#git help](https://chat.zulip.org/#narrow/stream/44-git-help) if
you run into any troubles. Be sure to check out the
[extremely useful Zulip-specific tools page](https://zulip.readthedocs.io/en/latest/git/zulip-tools.html).
* Sign the
[Dropbox Contributor License Agreement](https://opensource.dropbox.com/cla/).
### Picking an issue
@@ -170,8 +172,9 @@ labels.
## What makes a great Zulip contributor?
Zulip has a lot of experience working with new contributors. In our
experience, these are the best predictors of success:
Zulip runs a lot of [internship programs](#internship-programs), so we have
a lot of experience with new contributors. In our experience, these are the
best predictors of success:
* Posting good questions. This generally means explaining your current
understanding, saying what you've done or tried so far, and including
@@ -191,8 +194,8 @@ experience, these are the best predictors of success:
able to address things within a few days.
* Being helpful and friendly on chat.zulip.org.
These are also the main criteria we use to select candidates for all
of our outreach programs.
These are also the main criteria we use to select interns for all of our
internship programs.
## Reporting issues
@@ -213,9 +216,8 @@ and how to reproduce it if known, your browser/OS if relevant, and a
if appropriate.
**Reporting security issues**. Please do not report security issues
publicly, including on public streams on chat.zulip.org. You can
email security@zulip.com. We create a CVE for every security
issue in our released software.
publicly, including on public streams on chat.zulip.org. You can email
zulip-security@googlegroups.com. We create a CVE for every security issue.
## User feedback
@@ -230,7 +232,7 @@ to:
* Pros and cons: What are the pros and cons of Zulip for your organization,
and the pros and cons of other products you are evaluating?
* Features: What are the features that are most important for your
organization? In the best-case scenario, what would your chat solution do
organization? In the best case scenario, what would your chat solution do
for you?
* Onboarding: If you remember it, what was your impression during your first
few minutes of using Zulip? What did you notice, and how did you feel? Was
@@ -238,20 +240,21 @@ to:
* Organization: What does your organization do? How big is the organization?
A link to your organization's website?
## Outreach programs
## Internship programs
Zulip participates in [Google Summer of Code
(GSoC)](https://developers.google.com/open-source/gsoc/) every year.
In the past, we've also participated in
[Outreachy](https://www.outreachy.org/), [Google
Code-In](https://developers.google.com/open-source/gci/), and hosted
summer interns from Harvard, MIT, and Stanford.
Zulip runs internship programs with
[Outreachy](https://www.outreachy.org/),
[Google Summer of Code (GSoC)](https://developers.google.com/open-source/gsoc/)
[1], and the
[MIT Externship program](https://alum.mit.edu/students/NetworkwithAlumni/ExternshipProgram),
and has in the past taken summer interns from Harvard, MIT, and
Stanford.
While each third-party program has its own rules and requirements, the
Zulip community approaches all of these programs with these ideas in
mind:
* We try to make the application process as valuable for the applicant as
possible. Expect high-quality code reviews, a supportive community, and
possible. Expect high quality code reviews, a supportive community, and
publicly viewable patches you can link to from your resume, regardless of
whether you are selected.
* To apply, you'll have to submit at least one pull request to a Zulip
@@ -265,22 +268,26 @@ mind:
application to make mistakes in your first few PRs as long as your
work improves.
Most of our outreach program participants end up sticking around the
project long-term, and many have become core team members, maintaining
important parts of the project. We hope you apply!
Zulip also participates in
[Google Code-In](https://developers.google.com/open-source/gci/). Our
selection criteria for Finalists and Grand Prize Winners is the same as our
selection criteria for interns above.
Most of our interns end up sticking around the project long-term, and many
quickly become core team members. We hope you apply!
### Google Summer of Code
The largest outreach program Zulip participates in is GSoC (14
students in 2017; 11 in 2018; 17 in 2019). While we don't control how
many slots Google allocates to Zulip, we hope to mentor a similar
number of students in future summers.
GSoC is by far the largest of our internship programs (14 students in
2017; 11 in 2018; 17 in 2019). While we don't control how many slots
Google allocates to Zulip, we hope to mentor a similar number of
students in future summers.
If you're reading this well before the application deadline and want
to make your application strong, we recommend getting involved in the
community and fixing issues in Zulip now. Having good contributions
and building a reputation for doing good work is the best way to have
a strong application. About half of Zulip's GSoC students for Summer
and building a reputation for doing good work is best way to have a
strong application. About half of Zulip's GSoC students for Summer
2017 had made significant contributions to the project by February
2017, and about half had not. Our
[GSoC project ideas page][gsoc-guide] has lots more details on how
@@ -299,6 +306,10 @@ for ZSoC, we'll contact you when the GSoC results are announced.
[gsoc-guide]: https://zulip.readthedocs.io/en/latest/overview/gsoc-ideas.html
[gsoc-faq]: https://developers.google.com/open-source/gsoc/faq
[1] Formally, [GSoC isn't an internship][gsoc-faq], but it is similar
enough that we're treating it as such for the purposes of this
documentation.
## Zulip Outreach
**Upvoting Zulip**. Upvotes and reviews make a big difference in the public
@@ -333,7 +344,7 @@ have been using Zulip for a while and want to contribute more.
about a technical aspect of Zulip can be a great way to spread the word
about Zulip.
We also occasionally [publish](https://blog.zulip.org/) long-form
We also occasionally [publish](http://blog.zulip.org/) longer form
articles related to Zulip. Our posts typically get tens of thousands
of views, and we always have good ideas for blog posts that we can
outline but don't have time to write. If you are an experienced writer


@@ -7,9 +7,17 @@
# Install hunspell, zulip stop words, and run zulip database
# init.
FROM groonga/pgroonga:latest-alpine-10-slim
RUN apk add -U --no-cache hunspell-en
RUN ln -sf /usr/share/hunspell/en_US.dic /usr/local/share/postgresql/tsearch_data/en_us.dict && ln -sf /usr/share/hunspell/en_US.aff /usr/local/share/postgresql/tsearch_data/en_us.affix
COPY puppet/zulip/files/postgresql/zulip_english.stop /usr/local/share/postgresql/tsearch_data/zulip_english.stop
FROM postgres:10
COPY puppet/zulip/files/postgresql/zulip_english.stop /usr/share/postgresql/$PG_MAJOR/tsearch_data/zulip_english.stop
COPY scripts/setup/create-db.sql /docker-entrypoint-initdb.d/zulip-create-db.sql
COPY scripts/setup/create-pgroonga.sql /docker-entrypoint-initdb.d/zulip-create-pgroonga.sql
COPY scripts/setup/pgroonga-debian.asc /tmp
RUN apt-key add /tmp/pgroonga-debian.asc \
&& echo "deb http://packages.groonga.org/debian/ stretch main" > /etc/apt/sources.list.d/zulip.list \
&& apt-get update \
&& DEBIAN_FRONTEND=noninteractive apt-get install --no-install-recommends -y \
hunspell-en-us \
postgresql-${PG_MAJOR}-pgroonga \
&& ln -sf /var/cache/postgresql/dicts/en_us.dict "/usr/share/postgresql/$PG_MAJOR/tsearch_data/en_us.dict" \
&& ln -sf /var/cache/postgresql/dicts/en_us.affix "/usr/share/postgresql/$PG_MAJOR/tsearch_data/en_us.affix" \
&& rm -rf /var/lib/apt/lists/*


@@ -1,4 +1,4 @@
Copyright 2011-2020 Dropbox, Inc., Kandra Labs, Inc., and contributors
Copyright 2011-2018 Dropbox, Inc., Kandra Labs, Inc., and contributors
Apache License
Version 2.0, January 2004


@@ -29,12 +29,12 @@ You might be interested in:
* **Contributing code**. Check out our
[guide for new contributors](https://zulip.readthedocs.io/en/latest/overview/contributing.html)
to get started. Zulip prides itself on maintaining a clean and
well-tested codebase, and a stock of hundreds of
[beginner-friendly issues][beginner-friendly].
* **Contributing non-code**.
[Report an issue](https://zulip.readthedocs.io/en/latest/overview/contributing.html#reporting-issues),
[Report an issue](https://zulip.readthedocs.io/en/latest/overview/contributing.html#reporting-issue),
[translate](https://zulip.readthedocs.io/en/latest/translating/translating.html) Zulip
into your language,
[write](https://zulip.readthedocs.io/en/latest/overview/contributing.html#zulip-outreach)
@@ -51,25 +51,30 @@ You might be interested in:
the
[Zulip community server](https://zulip.readthedocs.io/en/latest/contributing/chat-zulip-org.html). We
also recommend reading Zulip for
[open source](https://zulip.com/for/open-source/), Zulip for
[companies](https://zulip.com/for/companies/), or Zulip for
[working groups and part time communities](https://zulip.com/for/working-groups-and-communities/).
[open source](https://zulipchat.com/for/open-source/), Zulip for
[companies](https://zulipchat.com/for/companies/), or Zulip for
[working groups and part time communities](https://zulipchat.com/for/working-groups-and-communities/).
* **Running a Zulip server**. Use a preconfigured [Digital Ocean droplet](https://marketplace.digitalocean.com/apps/zulip),
[install Zulip](https://zulip.readthedocs.io/en/stable/production/install.html)
directly, or use Zulip's
experimental [Docker image](https://zulip.readthedocs.io/en/latest/production/deployment.html#zulip-in-docker).
Commercial support is available; see <https://zulip.com/plans> for details.
Commercial support is available; see <https://zulipchat.com/plans> for details.
* **Using Zulip without setting up a server**. <https://zulip.com>
offers free and commercial hosting, including providing our paid
plan for free to fellow open source projects.
* **Using Zulip without setting up a server**. <https://zulipchat.com> offers
free and commercial hosting.
* **Participating in [outreach
programs](https://zulip.readthedocs.io/en/latest/overview/contributing.html#outreach-programs)**
like Google Summer of Code.
* **Applying for a Zulip internship**. Zulip runs internship programs with
[Outreachy](https://www.outreachy.org/),
[Google Summer of Code](https://developers.google.com/open-source/gsoc/),
and the
[MIT Externship program](https://alum.mit.edu/students/NetworkwithAlumni/ExternshipProgram). Zulip
also participates in
[Google Code-In](https://developers.google.com/open-source/gci/). More
information is available
[here](https://zulip.readthedocs.io/en/latest/overview/contributing.html#internship-programs).
You may also be interested in reading our [blog](https://blog.zulip.org/) or
You may also be interested in reading our [blog](http://blog.zulip.org/) or
following us on [twitter](https://twitter.com/zulip).
Zulip is distributed under the
[Apache 2.0](https://github.com/zulip/zulip/blob/master/LICENSE) license.


@@ -1,28 +0,0 @@
# Security Policy
Security announcements are sent to zulip-announce@googlegroups.com,
so you should subscribe if you are running Zulip in production.
## Reporting a Vulnerability
We love responsible reports of (potential) security issues in Zulip,
whether in the latest release or our development branch.
Our security contact is security@zulip.com. Reporters should expect a
response within 24 hours.
Please include details on the issue and how you'd like to be credited
in our release notes when we publish the fix.
Our [security
model](https://zulip.readthedocs.io/en/latest/production/security-model.html)
document may be a helpful resource.
## Supported Versions
Zulip provides security support for the latest major release, in the
form of minor security/maintenance releases.
We work hard to make
[upgrades](https://zulip.readthedocs.io/en/latest/production/upgrade-or-modify.html#upgrading-to-a-release)
reliable, so that there's no reason to run older major releases.

Vagrantfile

@@ -114,7 +114,13 @@ Vagrant.configure(VAGRANTFILE_API_VERSION) do |config|
end
config.vm.provider "virtualbox" do |vb, override|
override.vm.box = "hashicorp/bionic64"
override.vm.box = "ubuntu/bionic64"
# An unnecessary log file gets generated when running vagrant up for the
# first time with the Ubuntu Bionic box. This looks like it is being
# caused upstream by the base box containing a Vagrantfile with a similar
# line to the one below.
# see https://github.com/hashicorp/vagrant/issues/9425
vb.customize [ "modifyvm", :id, "--uartmode1", "disconnected" ]
# It's possible we can get away with just 1.5GB; more testing needed
vb.memory = vm_memory
vb.cpus = vm_num_cpus


@@ -1,35 +1,22 @@
import logging
import time
from collections import OrderedDict, defaultdict
from datetime import datetime, timedelta
from typing import Callable, Dict, Optional, Sequence, Tuple, Type, Union
import logging
from typing import Callable, Dict, List, \
Optional, Tuple, Type, Union
from django.conf import settings
from django.db import connection
from django.db.models import F
from psycopg2.sql import SQL, Composable, Identifier, Literal
from analytics.models import (
BaseCount,
FillState,
InstallationCount,
RealmCount,
StreamCount,
UserCount,
installation_epoch,
last_successful_fill,
)
from analytics.models import BaseCount, \
FillState, InstallationCount, RealmCount, StreamCount, \
UserCount, installation_epoch, last_successful_fill
from zerver.lib.logging_util import log_to_file
from zerver.lib.timestamp import ceiling_to_day, ceiling_to_hour, floor_to_hour, verify_UTC
from zerver.models import (
Message,
Realm,
RealmAuditLog,
Stream,
UserActivityInterval,
UserProfile,
models,
)
from zerver.lib.timestamp import ceiling_to_day, \
ceiling_to_hour, floor_to_hour, verify_UTC
from zerver.models import Message, Realm, RealmAuditLog, \
Stream, UserActivityInterval, UserProfile, models
## Logging setup ##
@@ -52,7 +39,7 @@ class CountStat:
self.data_collector = data_collector
# might have to do something different for bitfields
if frequency not in self.FREQUENCIES:
raise AssertionError(f"Unknown frequency: {frequency}")
raise AssertionError("Unknown frequency: %s" % (frequency,))
self.frequency = frequency
if interval is not None:
self.interval = interval
@@ -62,7 +49,7 @@ class CountStat:
self.interval = timedelta(days=1)
def __str__(self) -> str:
return f"<CountStat: {self.property}>"
return "<CountStat: %s>" % (self.property,)
class LoggingCountStat(CountStat):
def __init__(self, property: str, output_table: Type[BaseCount], frequency: str) -> None:
@@ -70,39 +57,29 @@ class LoggingCountStat(CountStat):
class DependentCountStat(CountStat):
def __init__(self, property: str, data_collector: 'DataCollector', frequency: str,
interval: Optional[timedelta] = None, dependencies: Sequence[str] = []) -> None:
interval: Optional[timedelta]=None, dependencies: List[str]=[]) -> None:
CountStat.__init__(self, property, data_collector, frequency, interval=interval)
self.dependencies = dependencies
class DataCollector:
def __init__(self, output_table: Type[BaseCount],
pull_function: Optional[Callable[[str, datetime, datetime, Optional[Realm]], int]]) -> None:
pull_function: Optional[Callable[[str, datetime, datetime], int]]) -> None:
self.output_table = output_table
self.pull_function = pull_function
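The DependentCountStat signature above is one of the behavior-neutral typing changes in this diff: Sequence[str] documents a read-only argument and sidesteps the shared-mutable-default pitfall that a List[str] = [] default invites. A self-contained illustration:

from typing import List, Sequence

def risky(deps: List[str] = []) -> List[str]:
    deps.append("x")  # mutates the one shared default list
    return deps

def safe(deps: Sequence[str] = []) -> List[str]:
    return list(deps) + ["x"]  # Sequence exposes no append; copy instead

print(risky())  # ['x']
print(risky())  # ['x', 'x'] -- state leaked across calls
print(safe())   # ['x']
print(safe())   # ['x']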
## CountStat-level operations ##
def process_count_stat(stat: CountStat, fill_to_time: datetime,
realm: Optional[Realm]=None) -> None:
# TODO: The realm argument is not yet supported, in that we don't
# have a solution for how to update FillState if it is passed. It
# exists solely as partial plumbing for when we do fully implement
# doing single-realm analytics runs for use cases like data import.
#
# Also, note that for the realm argument to be properly supported,
# the CountStat object passed in needs to have come from,
# e.g., get_count_stats(realm), i.e. have the realm_id already
# entered into the SQL query defined by the CountStat object.
def process_count_stat(stat: CountStat, fill_to_time: datetime) -> None:
if stat.frequency == CountStat.HOUR:
time_increment = timedelta(hours=1)
elif stat.frequency == CountStat.DAY:
time_increment = timedelta(days=1)
else:
raise AssertionError(f"Unknown frequency: {stat.frequency}")
raise AssertionError("Unknown frequency: %s" % (stat.frequency,))
verify_UTC(fill_to_time)
if floor_to_hour(fill_to_time) != fill_to_time:
raise ValueError(f"fill_to_time must be on an hour boundary: {fill_to_time}")
raise ValueError("fill_to_time must be on an hour boundary: %s" % (fill_to_time,))
fill_state = FillState.objects.filter(property=stat.property).first()
if fill_state is None:
@@ -110,37 +87,37 @@ def process_count_stat(stat: CountStat, fill_to_time: datetime,
fill_state = FillState.objects.create(property=stat.property,
end_time=currently_filled,
state=FillState.DONE)
logger.info("INITIALIZED %s %s", stat.property, currently_filled)
logger.info("INITIALIZED %s %s" % (stat.property, currently_filled))
elif fill_state.state == FillState.STARTED:
logger.info("UNDO START %s %s", stat.property, fill_state.end_time)
logger.info("UNDO START %s %s" % (stat.property, fill_state.end_time))
do_delete_counts_at_hour(stat, fill_state.end_time)
currently_filled = fill_state.end_time - time_increment
do_update_fill_state(fill_state, currently_filled, FillState.DONE)
logger.info("UNDO DONE %s", stat.property)
logger.info("UNDO DONE %s" % (stat.property,))
elif fill_state.state == FillState.DONE:
currently_filled = fill_state.end_time
else:
raise AssertionError(f"Unknown value for FillState.state: {fill_state.state}.")
raise AssertionError("Unknown value for FillState.state: %s." % (fill_state.state,))
if isinstance(stat, DependentCountStat):
for dependency in stat.dependencies:
dependency_fill_time = last_successful_fill(dependency)
if dependency_fill_time is None:
logger.warning("DependentCountStat %s run before dependency %s.",
stat.property, dependency)
logger.warning("DependentCountStat %s run before dependency %s." %
(stat.property, dependency))
return
fill_to_time = min(fill_to_time, dependency_fill_time)
currently_filled = currently_filled + time_increment
while currently_filled <= fill_to_time:
logger.info("START %s %s", stat.property, currently_filled)
logger.info("START %s %s" % (stat.property, currently_filled))
start = time.time()
do_update_fill_state(fill_state, currently_filled, FillState.STARTED)
do_fill_count_stat_at_hour(stat, currently_filled, realm)
do_fill_count_stat_at_hour(stat, currently_filled)
do_update_fill_state(fill_state, currently_filled, FillState.DONE)
end = time.time()
currently_filled = currently_filled + time_increment
logger.info("DONE %s (%dms)", stat.property, (end-start)*1000)
logger.info("DONE %s (%dms)" % (stat.property, (end-start)*1000))
def do_update_fill_state(fill_state: FillState, end_time: datetime, state: int) -> None:
fill_state.end_time = end_time
@@ -149,15 +126,15 @@ def do_update_fill_state(fill_state: FillState, end_time: datetime, state: int)
# We assume end_time is valid (e.g. is on a day or hour boundary as appropriate)
# and is timezone aware. It is the caller's responsibility to enforce this!
def do_fill_count_stat_at_hour(stat: CountStat, end_time: datetime, realm: Optional[Realm]=None) -> None:
def do_fill_count_stat_at_hour(stat: CountStat, end_time: datetime) -> None:
start_time = end_time - stat.interval
if not isinstance(stat, LoggingCountStat):
timer = time.time()
assert(stat.data_collector.pull_function is not None)
rows_added = stat.data_collector.pull_function(stat.property, start_time, end_time, realm)
logger.info("%s run pull_function (%dms/%sr)",
stat.property, (time.time()-timer)*1000, rows_added)
do_aggregate_to_summary_table(stat, end_time, realm)
rows_added = stat.data_collector.pull_function(stat.property, start_time, end_time)
logger.info("%s run pull_function (%dms/%sr)" %
(stat.property, (time.time()-timer)*1000, rows_added))
do_aggregate_to_summary_table(stat, end_time)
def do_delete_counts_at_hour(stat: CountStat, end_time: datetime) -> None:
if isinstance(stat, LoggingCountStat):
@@ -170,76 +147,51 @@ def do_delete_counts_at_hour(stat: CountStat, end_time: datetime) -> None:
RealmCount.objects.filter(property=stat.property, end_time=end_time).delete()
InstallationCount.objects.filter(property=stat.property, end_time=end_time).delete()
def do_aggregate_to_summary_table(stat: CountStat, end_time: datetime,
realm: Optional[Realm]=None) -> None:
def do_aggregate_to_summary_table(stat: CountStat, end_time: datetime) -> None:
cursor = connection.cursor()
# Aggregate into RealmCount
output_table = stat.data_collector.output_table
if realm is not None:
realm_clause = SQL("AND zerver_realm.id = {}").format(Literal(realm.id))
else:
realm_clause = SQL("")
if output_table in (UserCount, StreamCount):
realmcount_query = SQL("""
realmcount_query = """
INSERT INTO analytics_realmcount
(realm_id, value, property, subgroup, end_time)
SELECT
zerver_realm.id, COALESCE(sum({output_table}.value), 0), %(property)s,
{output_table}.subgroup, %(end_time)s
zerver_realm.id, COALESCE(sum(%(output_table)s.value), 0), '%(property)s',
%(output_table)s.subgroup, %%(end_time)s
FROM zerver_realm
JOIN {output_table}
JOIN %(output_table)s
ON
zerver_realm.id = {output_table}.realm_id
zerver_realm.id = %(output_table)s.realm_id
WHERE
{output_table}.property = %(property)s AND
{output_table}.end_time = %(end_time)s
{realm_clause}
GROUP BY zerver_realm.id, {output_table}.subgroup
""").format(
output_table=Identifier(output_table._meta.db_table),
realm_clause=realm_clause,
)
%(output_table)s.property = '%(property)s' AND
%(output_table)s.end_time = %%(end_time)s
GROUP BY zerver_realm.id, %(output_table)s.subgroup
""" % {'output_table': output_table._meta.db_table,
'property': stat.property}
start = time.time()
cursor.execute(realmcount_query, {
'property': stat.property,
'end_time': end_time,
})
cursor.execute(realmcount_query, {'end_time': end_time})
end = time.time()
logger.info(
"%s RealmCount aggregation (%dms/%sr)",
stat.property, (end - start) * 1000, cursor.rowcount,
)
if realm is None:
# Aggregate into InstallationCount. Only run if we just
# processed counts for all realms.
#
# TODO: Add support for updating installation data after
# changing an individual realm's values.
installationcount_query = SQL("""
INSERT INTO analytics_installationcount
(value, property, subgroup, end_time)
SELECT
sum(value), %(property)s, analytics_realmcount.subgroup, %(end_time)s
FROM analytics_realmcount
WHERE
property = %(property)s AND
end_time = %(end_time)s
GROUP BY analytics_realmcount.subgroup
""")
start = time.time()
cursor.execute(installationcount_query, {
'property': stat.property,
'end_time': end_time,
})
end = time.time()
logger.info(
"%s InstallationCount aggregation (%dms/%sr)",
stat.property, (end - start) * 1000, cursor.rowcount,
)
logger.info("%s RealmCount aggregation (%dms/%sr)" % (
stat.property, (end - start) * 1000, cursor.rowcount))
# Aggregate into InstallationCount
installationcount_query = """
INSERT INTO analytics_installationcount
(value, property, subgroup, end_time)
SELECT
sum(value), '%(property)s', analytics_realmcount.subgroup, %%(end_time)s
FROM analytics_realmcount
WHERE
property = '%(property)s' AND
end_time = %%(end_time)s
GROUP BY analytics_realmcount.subgroup
""" % {'property': stat.property}
start = time.time()
cursor.execute(installationcount_query, {'end_time': end_time})
end = time.time()
logger.info("%s InstallationCount aggregation (%dms/%sr)" % (
stat.property, (end - start) * 1000, cursor.rowcount))
cursor.close()
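To make the contrast in this hunk concrete: on the 3.1 side every identifier, literal, and optional clause is composed as a typed psycopg2.sql object rather than spliced into the query text by hand. A minimal sketch of that approach, assuming psycopg2 2.7+ is installed; the table name and realm id here are stand-in values, not taken from the diff:

```python
# A minimal sketch of the 3.1-side query composition, assuming psycopg2 2.7+.
# "analytics_usercount" and realm_id are stand-in values for illustration.
from psycopg2.sql import SQL, Identifier, Literal

realm_id = 42  # hypothetical realm primary key

# The optional clause is a Composable, not a hand-spliced string:
realm_clause = SQL("AND zerver_realm.id = {}").format(Literal(realm_id))
query = SQL(
    "SELECT count(*) FROM {output_table}"
    " WHERE property = %(property)s {realm_clause}"
).format(
    output_table=Identifier("analytics_usercount"),
    realm_clause=realm_clause,
)
# cursor.execute(query, {'property': 'messages_sent:is_bot:hour'}) would then
# bind %(property)s as a normal query parameter at execute time.
```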
## Utility functions called from outside counts.py ##
@@ -248,9 +200,6 @@ def do_aggregate_to_summary_table(stat: CountStat, end_time: datetime,
def do_increment_logging_stat(zerver_object: Union[Realm, UserProfile, Stream], stat: CountStat,
subgroup: Optional[Union[str, int, bool]], event_time: datetime,
increment: int=1) -> None:
if not increment:
return
table = stat.data_collector.output_table
if table == RealmCount:
id_args = {'realm': zerver_object}
@@ -287,71 +236,46 @@ def do_drop_single_stat(property: str) -> None:
## DataCollector-level operations ##
QueryFn = Callable[[Dict[str, Composable]], Composable]
def do_pull_by_sql_query(
property: str,
start_time: datetime,
end_time: datetime,
query: QueryFn,
group_by: Optional[Tuple[models.Model, str]],
) -> int:
def do_pull_by_sql_query(property: str, start_time: datetime, end_time: datetime, query: str,
group_by: Optional[Tuple[models.Model, str]]) -> int:
if group_by is None:
subgroup = SQL('NULL')
group_by_clause = SQL('')
subgroup = 'NULL'
group_by_clause = ''
else:
subgroup = Identifier(group_by[0]._meta.db_table, group_by[1])
group_by_clause = SQL(', {}').format(subgroup)
subgroup = '%s.%s' % (group_by[0]._meta.db_table, group_by[1])
group_by_clause = ', ' + subgroup
# We do string replacement here because cursor.execute will reject a
# group_by_clause given as a param.
# We pass in the datetimes as params to cursor.execute so that we don't have to
# think about how to convert python datetimes to SQL datetimes.
query_ = query({
'subgroup': subgroup,
'group_by_clause': group_by_clause,
})
query_ = query % {'property': property, 'subgroup': subgroup,
'group_by_clause': group_by_clause}
cursor = connection.cursor()
cursor.execute(query_, {
'property': property,
'time_start': start_time,
'time_end': end_time,
})
cursor.execute(query_, {'time_start': start_time, 'time_end': end_time})
rowcount = cursor.rowcount
cursor.close()
return rowcount
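The QueryFn alias introduced above captures the shape shared by all the 3.1-side query builders that follow: a callable from a dict of pre-composed fragments to a Composable. An illustrative sketch, assuming psycopg2 2.8+ for the two-argument Identifier; the query body itself is made up:

```python
# Illustrative sketch of the QueryFn shape; the query body here is made up.
from typing import Callable, Dict
from psycopg2.sql import SQL, Composable, Identifier

QueryFn = Callable[[Dict[str, Composable]], Composable]

example_query: QueryFn = lambda kwargs: SQL("""
    SELECT count(*), {subgroup}
    FROM zerver_userprofile
    GROUP BY zerver_userprofile.id {group_by_clause}
""").format(**kwargs)

subgroup = Identifier("zerver_userprofile", "is_bot")
composed = example_query({
    "subgroup": subgroup,
    "group_by_clause": SQL(", {}").format(subgroup),
})
print(composed)  # a Composed object, ready for cursor.execute
```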
def sql_data_collector(
output_table: Type[BaseCount],
query: QueryFn,
group_by: Optional[Tuple[models.Model, str]],
) -> DataCollector:
def pull_function(property: str, start_time: datetime, end_time: datetime,
realm: Optional[Realm] = None) -> int:
# The pull function type needs to accept a Realm argument
# because the 'minutes_active::day' CountStat uses
# DataCollector directly for do_pull_minutes_active, which
# requires the realm argument. We ignore it here, because the
# realm should have been already encoded in the `query` we're
# passed.
def sql_data_collector(output_table: Type[BaseCount], query: str,
group_by: Optional[Tuple[models.Model, str]]) -> DataCollector:
def pull_function(property: str, start_time: datetime, end_time: datetime) -> int:
return do_pull_by_sql_query(property, start_time, end_time, query, group_by)
return DataCollector(output_table, pull_function)
def do_pull_minutes_active(property: str, start_time: datetime, end_time: datetime,
realm: Optional[Realm] = None) -> int:
def do_pull_minutes_active(property: str, start_time: datetime, end_time: datetime) -> int:
user_activity_intervals = UserActivityInterval.objects.filter(
end__gt=start_time, start__lt=end_time,
end__gt=start_time, start__lt=end_time
).select_related(
'user_profile',
'user_profile'
).values_list(
'user_profile_id', 'user_profile__realm_id', 'start', 'end')
seconds_active: Dict[Tuple[int, int], float] = defaultdict(float)
seconds_active = defaultdict(float) # type: Dict[Tuple[int, int], float]
for user_id, realm_id, interval_start, interval_end in user_activity_intervals:
if realm is None or realm.id == realm_id:
start = max(start_time, interval_start)
end = min(end_time, interval_end)
seconds_active[(user_id, realm_id)] += (end - start).total_seconds()
start = max(start_time, interval_start)
end = min(end_time, interval_end)
seconds_active[(user_id, realm_id)] += (end - start).total_seconds()
rows = [UserCount(user_id=ids[0], realm_id=ids[1], property=property,
end_time=end_time, value=int(seconds // 60))
@@ -359,39 +283,28 @@ def do_pull_minutes_active(property: str, start_time: datetime, end_time: dateti
UserCount.objects.bulk_create(rows)
return len(rows)
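The clamping in do_pull_minutes_active credits each UserActivityInterval only for the part that overlaps [start_time, end_time). A worked example with made-up times:

```python
# Worked example of the clamping above: a 75-minute interval that starts 30
# minutes before the counting window only earns credit for the 45 minutes
# inside it.
from datetime import datetime, timedelta

start_time = datetime(2020, 6, 1)
end_time = start_time + timedelta(days=1)

interval_start = start_time - timedelta(minutes=30)
interval_end = start_time + timedelta(minutes=45)

start = max(start_time, interval_start)  # clamped up to the window start
end = min(end_time, interval_end)        # already inside the window
print(int((end - start).total_seconds() // 60))  # 45, as stored in UserCount
```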
def count_message_by_user_query(realm: Optional[Realm]) -> QueryFn:
if realm is None:
realm_clause = SQL("")
else:
realm_clause = SQL("zerver_userprofile.realm_id = {} AND").format(Literal(realm.id))
return lambda kwargs: SQL("""
count_message_by_user_query = """
INSERT INTO analytics_usercount
(user_id, realm_id, value, property, subgroup, end_time)
SELECT
zerver_userprofile.id, zerver_userprofile.realm_id, count(*),
%(property)s, {subgroup}, %(time_end)s
'%(property)s', %(subgroup)s, %%(time_end)s
FROM zerver_userprofile
JOIN zerver_message
ON
zerver_userprofile.id = zerver_message.sender_id
WHERE
zerver_userprofile.date_joined < %(time_end)s AND
zerver_message.date_sent >= %(time_start)s AND
{realm_clause}
zerver_message.date_sent < %(time_end)s
GROUP BY zerver_userprofile.id {group_by_clause}
""").format(**kwargs, realm_clause=realm_clause)
zerver_userprofile.date_joined < %%(time_end)s AND
zerver_message.date_sent >= %%(time_start)s AND
zerver_message.date_sent < %%(time_end)s
GROUP BY zerver_userprofile.id %(group_by_clause)s
"""
# Note: ignores the group_by / group_by_clause.
def count_message_type_by_user_query(realm: Optional[Realm]) -> QueryFn:
if realm is None:
realm_clause = SQL("")
else:
realm_clause = SQL("zerver_userprofile.realm_id = {} AND").format(Literal(realm.id))
return lambda kwargs: SQL("""
count_message_type_by_user_query = """
INSERT INTO analytics_usercount
(realm_id, user_id, value, property, subgroup, end_time)
SELECT realm_id, id, SUM(count) AS value, %(property)s, message_type, %(time_end)s
SELECT realm_id, id, SUM(count) AS value, '%(property)s', message_type, %%(time_end)s
FROM
(
SELECT zerver_userprofile.realm_id, zerver_userprofile.id, count(*),
@@ -409,9 +322,8 @@ def count_message_type_by_user_query(realm: Optional[Realm]) -> QueryFn:
JOIN zerver_message
ON
zerver_userprofile.id = zerver_message.sender_id AND
zerver_message.date_sent >= %(time_start)s AND
{realm_clause}
zerver_message.date_sent < %(time_end)s
zerver_message.date_sent >= %%(time_start)s AND
zerver_message.date_sent < %%(time_end)s
JOIN zerver_recipient
ON
zerver_message.recipient_id = zerver_recipient.id
@@ -423,22 +335,17 @@ def count_message_type_by_user_query(realm: Optional[Realm]) -> QueryFn:
zerver_recipient.type, zerver_stream.invite_only
) AS subquery
GROUP BY realm_id, id, message_type
""").format(**kwargs, realm_clause=realm_clause)
"""
# This query joins to the UserProfile table since all current queries that
# use this also subgroup on UserProfile.is_bot. If in the future there is a
# stat that counts messages by stream and doesn't need the UserProfile
# table, consider writing a new query for efficiency.
def count_message_by_stream_query(realm: Optional[Realm]) -> QueryFn:
if realm is None:
realm_clause = SQL("")
else:
realm_clause = SQL("zerver_stream.realm_id = {} AND").format(Literal(realm.id))
return lambda kwargs: SQL("""
count_message_by_stream_query = """
INSERT INTO analytics_streamcount
(stream_id, realm_id, value, property, subgroup, end_time)
SELECT
zerver_stream.id, zerver_stream.realm_id, count(*), %(property)s, {subgroup}, %(time_end)s
zerver_stream.id, zerver_stream.realm_id, count(*), '%(property)s', %(subgroup)s, %%(time_end)s
FROM zerver_stream
JOIN zerver_recipient
ON
@@ -450,61 +357,48 @@ def count_message_by_stream_query(realm: Optional[Realm]) -> QueryFn:
ON
zerver_message.sender_id = zerver_userprofile.id
WHERE
zerver_stream.date_created < %(time_end)s AND
zerver_stream.date_created < %%(time_end)s AND
zerver_recipient.type = 2 AND
zerver_message.date_sent >= %(time_start)s AND
{realm_clause}
zerver_message.date_sent < %(time_end)s
GROUP BY zerver_stream.id {group_by_clause}
""").format(**kwargs, realm_clause=realm_clause)
zerver_message.date_sent >= %%(time_start)s AND
zerver_message.date_sent < %%(time_end)s
GROUP BY zerver_stream.id %(group_by_clause)s
"""
# Hardcodes the query needed by active_users:is_bot:day, since that is
# currently the only stat that uses this.
def count_user_by_realm_query(realm: Optional[Realm]) -> QueryFn:
if realm is None:
realm_clause = SQL("")
else:
realm_clause = SQL("zerver_userprofile.realm_id = {} AND").format(Literal(realm.id))
return lambda kwargs: SQL("""
count_user_by_realm_query = """
INSERT INTO analytics_realmcount
(realm_id, value, property, subgroup, end_time)
SELECT
zerver_realm.id, count(*), %(property)s, {subgroup}, %(time_end)s
zerver_realm.id, count(*), '%(property)s', %(subgroup)s, %%(time_end)s
FROM zerver_realm
JOIN zerver_userprofile
ON
zerver_realm.id = zerver_userprofile.realm_id
WHERE
zerver_realm.date_created < %(time_end)s AND
zerver_userprofile.date_joined >= %(time_start)s AND
zerver_userprofile.date_joined < %(time_end)s AND
{realm_clause}
zerver_realm.date_created < %%(time_end)s AND
zerver_userprofile.date_joined >= %%(time_start)s AND
zerver_userprofile.date_joined < %%(time_end)s AND
zerver_userprofile.is_active = TRUE
GROUP BY zerver_realm.id {group_by_clause}
""").format(**kwargs, realm_clause=realm_clause)
GROUP BY zerver_realm.id %(group_by_clause)s
"""
# Currently hardcodes the query needed for active_users_audit:is_bot:day.
# Assumes that a user cannot have two RealmAuditLog entries with the same event_time and
# event_type in [RealmAuditLog.USER_CREATED, USER_DEACTIVATED, etc].
# In particular, it's important to ensure that migrations don't cause that to happen.
def check_realmauditlog_by_user_query(realm: Optional[Realm]) -> QueryFn:
if realm is None:
realm_clause = SQL("")
else:
realm_clause = SQL("realm_id = {} AND").format(Literal(realm.id))
return lambda kwargs: SQL("""
check_realmauditlog_by_user_query = """
INSERT INTO analytics_usercount
(user_id, realm_id, value, property, subgroup, end_time)
SELECT
ral1.modified_user_id, ral1.realm_id, 1, %(property)s, {subgroup}, %(time_end)s
ral1.modified_user_id, ral1.realm_id, 1, '%(property)s', %(subgroup)s, %%(time_end)s
FROM zerver_realmauditlog ral1
JOIN (
SELECT modified_user_id, max(event_time) AS max_event_time
FROM zerver_realmauditlog
WHERE
event_type in ({user_created}, {user_activated}, {user_deactivated}, {user_reactivated}) AND
{realm_clause}
event_time < %(time_end)s
event_time < %%(time_end)s
GROUP BY modified_user_id
) ral2
ON
@@ -515,180 +409,133 @@ def check_realmauditlog_by_user_query(realm: Optional[Realm]) -> QueryFn:
ral1.modified_user_id = zerver_userprofile.id
WHERE
ral1.event_type in ({user_created}, {user_activated}, {user_reactivated})
""").format(
**kwargs,
user_created=Literal(RealmAuditLog.USER_CREATED),
user_activated=Literal(RealmAuditLog.USER_ACTIVATED),
user_deactivated=Literal(RealmAuditLog.USER_DEACTIVATED),
user_reactivated=Literal(RealmAuditLog.USER_REACTIVATED),
realm_clause=realm_clause,
)
""".format(user_created=RealmAuditLog.USER_CREATED,
user_activated=RealmAuditLog.USER_ACTIVATED,
user_deactivated=RealmAuditLog.USER_DEACTIVATED,
user_reactivated=RealmAuditLog.USER_REACTIVATED)
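The subquery above implements a latest-event-wins rule: for each user, the most recent creation/activation/deactivation/reactivation event before time_end decides whether the user is counted. A toy sketch of that rule in plain Python; the numeric event codes are illustrative stand-ins, not the real RealmAuditLog constants:

```python
# Toy sketch of the latest-event-wins rule; codes are illustrative stand-ins.
USER_CREATED, USER_ACTIVATED = 101, 102
USER_DEACTIVATED, USER_REACTIVATED = 103, 104

events = [  # (user_id, event_time, event_type)
    (1, 10, USER_CREATED),
    (1, 20, USER_DEACTIVATED),
    (2, 15, USER_CREATED),
    (2, 30, USER_REACTIVATED),  # after time_end, so not considered
]
time_end = 25

latest = {}  # user_id -> (event_time, event_type) of the latest event < time_end
for user_id, event_time, event_type in events:
    if event_time < time_end and event_time > latest.get(user_id, (-1,))[0]:
        latest[user_id] = (event_time, event_type)

active = sorted(u for u, (_, kind) in latest.items()
                if kind in (USER_CREATED, USER_ACTIVATED, USER_REACTIVATED))
print(active)  # [2]; user 1's latest event before time_end is a deactivation
```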
def check_useractivityinterval_by_user_query(realm: Optional[Realm]) -> QueryFn:
if realm is None:
realm_clause = SQL("")
else:
realm_clause = SQL("zerver_userprofile.realm_id = {} AND").format(Literal(realm.id))
return lambda kwargs: SQL("""
check_useractivityinterval_by_user_query = """
INSERT INTO analytics_usercount
(user_id, realm_id, value, property, subgroup, end_time)
SELECT
zerver_userprofile.id, zerver_userprofile.realm_id, 1, %(property)s, {subgroup}, %(time_end)s
zerver_userprofile.id, zerver_userprofile.realm_id, 1, '%(property)s', %(subgroup)s, %%(time_end)s
FROM zerver_userprofile
JOIN zerver_useractivityinterval
ON
zerver_userprofile.id = zerver_useractivityinterval.user_profile_id
WHERE
zerver_useractivityinterval.end >= %(time_start)s AND
{realm_clause}
zerver_useractivityinterval.start < %(time_end)s
GROUP BY zerver_userprofile.id {group_by_clause}
""").format(**kwargs, realm_clause=realm_clause)
zerver_useractivityinterval.end >= %%(time_start)s AND
zerver_useractivityinterval.start < %%(time_end)s
GROUP BY zerver_userprofile.id %(group_by_clause)s
"""
def count_realm_active_humans_query(realm: Optional[Realm]) -> QueryFn:
if realm is None:
realm_clause = SQL("")
else:
realm_clause = SQL("realm_id = {} AND").format(Literal(realm.id))
return lambda kwargs: SQL("""
count_realm_active_humans_query = """
INSERT INTO analytics_realmcount
(realm_id, value, property, subgroup, end_time)
SELECT
usercount1.realm_id, count(*), %(property)s, NULL, %(time_end)s
usercount1.realm_id, count(*), '%(property)s', NULL, %%(time_end)s
FROM (
SELECT realm_id, user_id
FROM analytics_usercount
WHERE
property = 'active_users_audit:is_bot:day' AND
subgroup = 'false' AND
{realm_clause}
end_time = %(time_end)s
end_time = %%(time_end)s
) usercount1
JOIN (
SELECT realm_id, user_id
FROM analytics_usercount
WHERE
property = '15day_actives::day' AND
{realm_clause}
end_time = %(time_end)s
end_time = %%(time_end)s
) usercount2
ON
usercount1.user_id = usercount2.user_id
GROUP BY usercount1.realm_id
""").format(**kwargs, realm_clause=realm_clause)
"""
# Currently unused and untested
count_stream_by_realm_query = lambda kwargs: SQL("""
count_stream_by_realm_query = """
INSERT INTO analytics_realmcount
(realm_id, value, property, subgroup, end_time)
SELECT
zerver_realm.id, count(*), %(property)s, {subgroup}, %(time_end)s
zerver_realm.id, count(*), '%(property)s', %(subgroup)s, %%(time_end)s
FROM zerver_realm
JOIN zerver_stream
ON
zerver_realm.id = zerver_stream.realm_id AND
WHERE
zerver_realm.date_created < %(time_end)s AND
zerver_stream.date_created >= %(time_start)s AND
zerver_stream.date_created < %(time_end)s
GROUP BY zerver_realm.id {group_by_clause}
""").format(**kwargs)
zerver_realm.date_created < %%(time_end)s AND
zerver_stream.date_created >= %%(time_start)s AND
zerver_stream.date_created < %%(time_end)s
GROUP BY zerver_realm.id %(group_by_clause)s
"""
def get_count_stats(realm: Optional[Realm]=None) -> Dict[str, CountStat]:
## CountStat declarations ##
## CountStat declarations ##
count_stats_ = [
# Messages Sent stats
# Stats that count the number of messages sent in various ways.
# These are also the set of stats that read from the Message table.
count_stats_ = [
# Messages Sent stats
# Stats that count the number of messages sent in various ways.
# These are also the set of stats that read from the Message table.
CountStat('messages_sent:is_bot:hour',
sql_data_collector(UserCount, count_message_by_user_query(
realm), (UserProfile, 'is_bot')),
CountStat.HOUR),
CountStat('messages_sent:message_type:day',
sql_data_collector(
UserCount, count_message_type_by_user_query(realm), None),
CountStat.DAY),
CountStat('messages_sent:client:day',
sql_data_collector(UserCount, count_message_by_user_query(realm),
(Message, 'sending_client_id')), CountStat.DAY),
CountStat('messages_in_stream:is_bot:day',
sql_data_collector(StreamCount, count_message_by_stream_query(realm),
(UserProfile, 'is_bot')), CountStat.DAY),
CountStat('messages_sent:is_bot:hour',
sql_data_collector(UserCount, count_message_by_user_query, (UserProfile, 'is_bot')),
CountStat.HOUR),
CountStat('messages_sent:message_type:day',
sql_data_collector(UserCount, count_message_type_by_user_query, None), CountStat.DAY),
CountStat('messages_sent:client:day',
sql_data_collector(UserCount, count_message_by_user_query, (Message, 'sending_client_id')),
CountStat.DAY),
CountStat('messages_in_stream:is_bot:day',
sql_data_collector(StreamCount, count_message_by_stream_query, (UserProfile, 'is_bot')),
CountStat.DAY),
# Number of Users stats
# Stats that count the number of active users in the UserProfile.is_active sense.
# Number of Users stats
# Stats that count the number of active users in the UserProfile.is_active sense.
# 'active_users_audit:is_bot:day' is the canonical record of which users were
# active on which days (in the UserProfile.is_active sense).
# Important that this stay a daily stat, so that 'realm_active_humans::day' works as expected.
CountStat('active_users_audit:is_bot:day',
sql_data_collector(UserCount, check_realmauditlog_by_user_query(
realm), (UserProfile, 'is_bot')),
CountStat.DAY),
# 'active_users_audit:is_bot:day' is the canonical record of which users were
# active on which days (in the UserProfile.is_active sense).
# Important that this stay a daily stat, so that 'realm_active_humans::day' works as expected.
CountStat('active_users_audit:is_bot:day',
sql_data_collector(UserCount, check_realmauditlog_by_user_query, (UserProfile, 'is_bot')),
CountStat.DAY),
# Sanity check on 'active_users_audit:is_bot:day', and an archetype for future LoggingCountStats.
# In RealmCount, 'active_users_audit:is_bot:day' should be the partial
# sum sequence of 'active_users_log:is_bot:day', for any realm that
# started after the latter stat was introduced.
LoggingCountStat('active_users_log:is_bot:day', RealmCount, CountStat.DAY),
# Another sanity check on 'active_users_audit:is_bot:day'. It is only an
# approximation; e.g. if a user is deactivated between the end of the
# day and when this stat is run, they won't be counted. However, it is the
# simplest of the three to inspect by hand.
CountStat('active_users:is_bot:day',
sql_data_collector(RealmCount, count_user_by_realm_query, (UserProfile, 'is_bot')),
CountStat.DAY, interval=TIMEDELTA_MAX),
# Important note: LoggingCountStat objects aren't passed the
# Realm argument, because by nature they have a logging
# structure, not a pull-from-database structure, so there's no
# way to compute them for a single realm after the fact (the
# use case for passing a Realm argument).
# User Activity stats
# Stats that measure user activity in the UserActivityInterval sense.
# Sanity check on 'active_users_audit:is_bot:day', and an archetype for future LoggingCountStats.
# In RealmCount, 'active_users_audit:is_bot:day' should be the partial
# sum sequence of 'active_users_log:is_bot:day', for any realm that
# started after the latter stat was introduced.
LoggingCountStat('active_users_log:is_bot:day',
RealmCount, CountStat.DAY),
# Another sanity check on 'active_users_audit:is_bot:day'. It is only an
# approximation; e.g. if a user is deactivated between the end of the
# day and when this stat is run, they won't be counted. However, it is the
# simplest of the three to inspect by hand.
CountStat('active_users:is_bot:day',
sql_data_collector(RealmCount, count_user_by_realm_query(realm), (UserProfile, 'is_bot')),
CountStat.DAY, interval=TIMEDELTA_MAX),
CountStat('1day_actives::day',
sql_data_collector(UserCount, check_useractivityinterval_by_user_query, None),
CountStat.DAY, interval=timedelta(days=1)-UserActivityInterval.MIN_INTERVAL_LENGTH),
CountStat('15day_actives::day',
sql_data_collector(UserCount, check_useractivityinterval_by_user_query, None),
CountStat.DAY, interval=timedelta(days=15)-UserActivityInterval.MIN_INTERVAL_LENGTH),
CountStat('minutes_active::day', DataCollector(UserCount, do_pull_minutes_active), CountStat.DAY),
# Messages read stats. messages_read::hour is the total
# number of messages read, whereas
# messages_read_interactions::hour tries to count the total
# number of UI interactions resulting in messages being marked
# as read (imperfect because of batching of some request
# types, but less likely to be overwhelmed by a single bulk
# operation).
LoggingCountStat('messages_read::hour', UserCount, CountStat.HOUR),
LoggingCountStat('messages_read_interactions::hour', UserCount, CountStat.HOUR),
# Rate limiting stats
# User Activity stats
# Stats that measure user activity in the UserActivityInterval sense.
# Used to limit the number of invitation emails sent by a realm
LoggingCountStat('invites_sent::day', RealmCount, CountStat.DAY),
CountStat('1day_actives::day',
sql_data_collector(
UserCount, check_useractivityinterval_by_user_query(realm), None),
CountStat.DAY, interval=timedelta(days=1)-UserActivityInterval.MIN_INTERVAL_LENGTH),
CountStat('15day_actives::day',
sql_data_collector(
UserCount, check_useractivityinterval_by_user_query(realm), None),
CountStat.DAY, interval=timedelta(days=15)-UserActivityInterval.MIN_INTERVAL_LENGTH),
CountStat('minutes_active::day', DataCollector(
UserCount, do_pull_minutes_active), CountStat.DAY),
# Dependent stats
# Must come after their dependencies.
# Rate limiting stats
# Canonical account of the number of active humans in a realm on each day.
DependentCountStat('realm_active_humans::day',
sql_data_collector(RealmCount, count_realm_active_humans_query, None),
CountStat.DAY,
dependencies=['active_users_audit:is_bot:day', '15day_actives::day'])
]
# Used to limit the number of invitation emails sent by a realm
LoggingCountStat('invites_sent::day', RealmCount, CountStat.DAY),
# Dependent stats
# Must come after their dependencies.
# Canonical account of the number of active humans in a realm on each day.
DependentCountStat('realm_active_humans::day',
sql_data_collector(
RealmCount, count_realm_active_humans_query(realm), None),
CountStat.DAY,
dependencies=['active_users_audit:is_bot:day', '15day_actives::day']),
]
return OrderedDict([(stat.property, stat) for stat in count_stats_])
# To avoid refactoring for now COUNT_STATS can be used as before
COUNT_STATS = get_count_stats()
COUNT_STATS = OrderedDict([(stat.property, stat) for stat in count_stats_])
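Both versions keep the stats in an OrderedDict, presumably so that callers iterating COUNT_STATS fill dependencies first (note the "Must come after their dependencies" comment above). A toy sketch of that ordering invariant, using a minimal stand-in Stat class rather than the real CountStat:

```python
# Toy sketch of the ordering invariant, with a stand-in Stat class.
from collections import OrderedDict

class Stat:
    def __init__(self, prop, dependencies=()):
        self.property = prop
        self.dependencies = tuple(dependencies)

stats = [
    Stat("active_users_audit:is_bot:day"),
    Stat("15day_actives::day"),
    Stat("realm_active_humans::day",
         dependencies=["active_users_audit:is_bot:day", "15day_actives::day"]),
]
toy_count_stats = OrderedDict((s.property, s) for s in stats)

order = list(toy_count_stats)
for prop, stat in toy_count_stats.items():
    # every dependency appears earlier in iteration order
    assert all(order.index(dep) < order.index(prop) for dep in stat.dependencies)
```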

View File

@@ -4,7 +4,6 @@ from typing import List
from analytics.lib.counts import CountStat
def generate_time_series_data(days: int=100, business_hours_base: float=10,
non_business_hours_base: float=10, growth: float=1,
autocorrelation: float=0, spikiness: float=1,
@@ -44,10 +43,10 @@ def generate_time_series_data(days: int=100, business_hours_base: float=10,
[24*non_business_hours_base] * 2
holidays = [random() < holiday_rate for i in range(days)]
else:
raise AssertionError(f"Unknown frequency: {frequency}")
raise AssertionError("Unknown frequency: %s" % (frequency,))
if length < 2:
raise AssertionError("Must be generating at least 2 data points. "
f"Currently generating {length}")
"Currently generating %s" % (length,))
growth_base = growth ** (1. / (length-1))
values_no_noise = [seasonality[i % len(seasonality)] * (growth_base**i) for i in range(length)]
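The growth_base expression above distributes the total growth factor evenly across the series: each step multiplies by growth ** (1/(length-1)), so the last point is exactly growth times the first. A quick numeric check:

```python
# With growth=2 over 101 points, each step multiplies by 2**(1/100),
# so the last point is exactly twice the first.
growth, length = 2.0, 101
growth_base = growth ** (1.0 / (length - 1))
print(round(growth_base ** (length - 1), 10))  # 2.0
```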

View File

@@ -4,7 +4,6 @@ from typing import List, Optional
from analytics.lib.counts import CountStat
from zerver.lib.timestamp import floor_to_day, floor_to_hour, verify_UTC
# If min_length is None, returns end_times from ceiling(start) to floor(end), inclusive.
# If min_length is greater than 0, pads the list to the left.
# So informally, time_range(Sep 20, Sep 22, day, None) returns [Sep 20, Sep 21, Sep 22],
@@ -20,7 +19,7 @@ def time_range(start: datetime, end: datetime, frequency: str,
end = floor_to_day(end)
step = timedelta(days=1)
else:
raise AssertionError(f"Unknown frequency: {frequency}")
raise AssertionError("Unknown frequency: %s" % (frequency,))
times = []
if min_length is not None:
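A hedged toy version of the time_range contract described in the comment above, restricted to day frequency and assuming start and end already lie on day boundaries (the real helper floors/ceils them and verifies UTC):

```python
# Toy day-frequency sketch of the time_range contract described above.
from datetime import datetime, timedelta

def toy_time_range(start, end, min_length=None):
    step = timedelta(days=1)
    times = []
    current = start
    while current <= end:
        times.append(current)
        current += step
    if min_length is not None:
        while len(times) < min_length:
            times.insert(0, times[0] - step)  # pad on the left
    return times

print(toy_time_range(datetime(2020, 9, 20), datetime(2020, 9, 22)))
# [Sep 20, Sep 21, Sep 22], matching the informal example above
print(len(toy_time_range(datetime(2020, 9, 20), datetime(2020, 9, 22), 5)))  # 5
```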

View File

@@ -8,7 +8,6 @@ from django.core.management.base import BaseCommand, CommandParser
from zerver.lib.timestamp import timestamp_to_datetime
from zerver.models import Message, Recipient
def compute_stats(log_level: int) -> None:
logger = logging.getLogger()
logger.setLevel(log_level)
@@ -27,15 +26,15 @@ def compute_stats(log_level: int) -> None:
"bitcoin@mit.edu", "lp@mit.edu", "clocks@mit.edu",
"root@mit.edu", "nagios@mit.edu",
"www-data|local-realm@mit.edu"])
user_counts: Dict[str, Dict[str, int]] = {}
user_counts = {} # type: Dict[str, Dict[str, int]]
for m in mit_query.select_related("sending_client", "sender"):
email = m.sender.email
user_counts.setdefault(email, {})
user_counts[email].setdefault(m.sending_client.name, 0)
user_counts[email][m.sending_client.name] += 1
total_counts: Dict[str, int] = {}
total_user_counts: Dict[str, int] = {}
total_counts = {} # type: Dict[str, int]
total_user_counts = {} # type: Dict[str, int]
for email, counts in user_counts.items():
total_user_counts.setdefault(email, 0)
for client_name, count in counts.items():
@@ -43,8 +42,8 @@ def compute_stats(log_level: int) -> None:
total_counts[client_name] += count
total_user_counts[email] += count
logging.debug("%40s | %10s | %s", "User", "Messages", "Percentage Zulip")
top_percents: Dict[int, float] = {}
logging.debug("%40s | %10s | %s" % ("User", "Messages", "Percentage Zulip"))
top_percents = {} # type: Dict[int, float]
for size in [10, 25, 50, 100, 200, len(total_user_counts.keys())]:
top_percents[size] = 0.0
for i, email in enumerate(sorted(total_user_counts.keys(),
@@ -56,18 +55,18 @@ def compute_stats(log_level: int) -> None:
if i < size:
top_percents[size] += (percent_zulip * 1.0 / size)
logging.debug("%40s | %10s | %s%%", email, total_user_counts[email],
percent_zulip)
logging.debug("%40s | %10s | %s%%" % (email, total_user_counts[email],
percent_zulip))
logging.info("")
for size in sorted(top_percents.keys()):
logging.info("Top %6s | %s%%", size, round(top_percents[size], 1))
logging.info("Top %6s | %s%%" % (size, round(top_percents[size], 1)))
grand_total = sum(total_counts.values())
print(grand_total)
logging.info("%15s | %s", "Client", "Percentage")
logging.info("%15s | %s" % ("Client", "Percentage"))
for client in total_counts.keys():
logging.info("%15s | %s%%", client, round(100. * total_counts[client] / grand_total, 1))
logging.info("%15s | %s%%" % (client, round(100. * total_counts[client] / grand_total, 1)))
class Command(BaseCommand):
help = "Compute statistics on MIT Zephyr usage."

View File

@@ -2,13 +2,13 @@ import datetime
from typing import Any, Dict
from django.core.management.base import BaseCommand, CommandParser
from django.utils.timezone import utc
from zerver.lib.statistics import seconds_usage_between
from zerver.models import UserProfile
def analyze_activity(options: Dict[str, Any]) -> None:
day_start = datetime.datetime.strptime(options["date"], "%Y-%m-%d").replace(tzinfo=datetime.timezone.utc)
day_start = datetime.datetime.strptime(options["date"], "%Y-%m-%d").replace(tzinfo=utc)
day_end = day_start + datetime.timedelta(days=options["duration"])
user_profile_query = UserProfile.objects.all()
@@ -24,11 +24,11 @@ def analyze_activity(options: Dict[str, Any]) -> None:
continue
total_duration += duration
print(f"{user_profile.email:<37}{duration}")
print("%-*s%s" % (37, user_profile.email, duration,))
print(f"\nTotal Duration: {total_duration}")
print(f"\nTotal Duration in minutes: {total_duration.total_seconds() / 60.}")
print(f"Total Duration amortized to a month: {total_duration.total_seconds() * 30. / 60.}")
print("\nTotal Duration: %s" % (total_duration,))
print("\nTotal Duration in minutes: %s" % (total_duration.total_seconds() / 60.,))
print("Total Duration amortized to a month: %s" % (total_duration.total_seconds() * 30. / 60.,))
class Command(BaseCommand):
help = """Report analytics of user activity on a per-user and realm basis.

View File

@@ -1,21 +1,24 @@
import os
import time
from datetime import timedelta
from typing import Any, Dict
from django.core.management.base import BaseCommand
from django.utils.timezone import now as timezone_now
from analytics.models import installation_epoch, \
last_successful_fill
from analytics.lib.counts import COUNT_STATS, CountStat
from analytics.models import installation_epoch, last_successful_fill
from zerver.lib.timestamp import TimezoneNotUTCException, floor_to_day, floor_to_hour, verify_UTC
from zerver.lib.timestamp import floor_to_hour, floor_to_day, verify_UTC, \
TimezoneNotUTCException
from zerver.models import Realm
import os
import time
from typing import Any, Dict
states = {
0: "OK",
1: "WARNING",
2: "CRITICAL",
3: "UNKNOWN",
3: "UNKNOWN"
}
class Command(BaseCommand):
@@ -32,7 +35,8 @@ class Command(BaseCommand):
state_file_tmp = state_file_path + "-tmp"
with open(state_file_tmp, "w") as f:
f.write(f"{int(time.time())}|{status}|{states[status]}|{message}\n")
f.write("%s|%s|%s|%s\n" % (
int(time.time()), status, states[status], message))
os.rename(state_file_tmp, state_file_path)
def get_fill_state(self) -> Dict[str, Any]:
@@ -48,7 +52,7 @@ class Command(BaseCommand):
try:
verify_UTC(last_fill)
except TimezoneNotUTCException:
return {'status': 2, 'message': f'FillState not in UTC for {property}'}
return {'status': 2, 'message': 'FillState not in UTC for %s' % (property,)}
if stat.frequency == CountStat.DAY:
floor_function = floor_to_day
@@ -60,7 +64,8 @@ class Command(BaseCommand):
critical_threshold = timedelta(minutes=150)
if floor_function(last_fill) != last_fill:
return {'status': 2, 'message': f'FillState not on {stat.frequency} boundary for {property}'}
return {'status': 2, 'message': 'FillState not on %s boundary for %s' %
(stat.frequency, property)}
time_to_last_fill = timezone_now() - last_fill
if time_to_last_fill > critical_threshold:
@@ -71,16 +76,7 @@ class Command(BaseCommand):
if len(critical_unfilled_properties) == 0 and len(warning_unfilled_properties) == 0:
return {'status': 0, 'message': 'FillState looks fine.'}
if len(critical_unfilled_properties) == 0:
return {
'status': 1,
'message': 'Missed filling {} once.'.format(
', '.join(warning_unfilled_properties),
),
}
return {
'status': 2,
'message': 'Missed filling {} once. Missed filling {} at least twice.'.format(
', '.join(warning_unfilled_properties),
', '.join(critical_unfilled_properties),
),
}
return {'status': 1, 'message': 'Missed filling %s once.' %
(', '.join(warning_unfilled_properties),)}
return {'status': 2, 'message': 'Missed filling %s once. Missed filling %s at least twice.' %
(', '.join(warning_unfilled_properties), ', '.join(critical_unfilled_properties))}
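For reference, the Nagios state file written earlier in this command holds a single epoch|code|name|message line, which both code paths in this hunk feed into. A small round-trip sketch; the message text is made up:

```python
# Round-trip sketch of the epoch|code|name|message state line.
import time

states = {0: "OK", 1: "WARNING", 2: "CRITICAL", 3: "UNKNOWN"}
status, message = 1, "Missed filling 1day_actives::day once."
line = "%s|%s|%s|%s\n" % (int(time.time()), status, states[status], message)
epoch, code, name, text = line.rstrip("\n").split("|", 3)
assert states[int(code)] == name and text == message
```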

View File

@@ -5,7 +5,6 @@ from django.core.management.base import BaseCommand, CommandError
from analytics.lib.counts import do_drop_all_analytics_tables
class Command(BaseCommand):
help = """Clear analytics tables."""

View File

@@ -5,7 +5,6 @@ from django.core.management.base import BaseCommand, CommandError
from analytics.lib.counts import COUNT_STATS, do_drop_single_stat
class Command(BaseCommand):
help = """Clear analytics tables."""
@@ -20,7 +19,7 @@ class Command(BaseCommand):
def handle(self, *args: Any, **options: Any) -> None:
property = options['property']
if property not in COUNT_STATS:
raise CommandError(f"Invalid property: {property}")
raise CommandError("Invalid property: %s" % (property,))
if not options['force']:
raise CommandError("No action taken. Use --force.")

View File

@@ -8,7 +8,6 @@ from django.utils.timezone import now as timezone_now
from zerver.lib.management import ZulipBaseCommand
from zerver.models import UserActivity
class Command(ZulipBaseCommand):
help = """Report rough client activity globally, for a realm, or for a user
@@ -54,7 +53,7 @@ Usage examples:
counts.sort()
for count in counts:
print(f"{count[1]:>25} {count[0]:15}")
print("%25s %15d" % (count[1], count[0]))
print("Total:", total)
def handle(self, *args: Any, **options: Optional[str]) -> None:

View File

@@ -1,26 +1,21 @@
from datetime import timedelta
from typing import Any, Dict, List, Mapping, Optional, Type
from unittest import mock
import mock
from django.core.management.base import BaseCommand
from django.utils.timezone import now as timezone_now
from analytics.lib.counts import COUNT_STATS, CountStat, do_drop_all_analytics_tables
from analytics.lib.counts import COUNT_STATS, \
CountStat, do_drop_all_analytics_tables
from analytics.lib.fixtures import generate_time_series_data
from analytics.lib.time_utils import time_range
from analytics.models import (
BaseCount,
FillState,
InstallationCount,
RealmCount,
StreamCount,
UserCount,
)
from zerver.lib.actions import STREAM_ASSIGNMENT_COLORS, do_change_user_role
from analytics.models import BaseCount, FillState, RealmCount, UserCount, \
StreamCount, InstallationCount
from zerver.lib.actions import do_change_is_admin, STREAM_ASSIGNMENT_COLORS
from zerver.lib.create_user import create_user
from zerver.lib.timestamp import floor_to_day
from zerver.models import Client, Realm, Recipient, Stream, Subscription, UserProfile
from zerver.models import Realm, Stream, Client, \
Recipient, Subscription
class Command(BaseCommand):
help = """Populates analytics tables with randomly generated data."""
@@ -61,14 +56,10 @@ class Command(BaseCommand):
realm = Realm.objects.create(
string_id='analytics', name='Analytics', date_created=installation_time)
with mock.patch("zerver.lib.create_user.timezone_now", return_value=installation_time):
shylock = create_user(
'shylock@analytics.ds',
'Shylock',
realm,
full_name='Shylock',
role=UserProfile.ROLE_REALM_ADMINISTRATOR
)
do_change_user_role(shylock, UserProfile.ROLE_REALM_ADMINISTRATOR, acting_user=None)
shylock = create_user('shylock@analytics.ds', 'Shylock', realm,
full_name='Shylock', short_name='shylock',
is_realm_admin=True)
do_change_is_admin(shylock, True)
stream = Stream.objects.create(
name='all', realm=realm, date_created=installation_time)
recipient = Recipient.objects.create(type_id=stream.id, type=Recipient.STREAM)
@@ -90,7 +81,7 @@ class Command(BaseCommand):
end_times = time_range(last_end_time, last_end_time, stat.frequency,
len(list(fixture_data.values())[0]))
if table == InstallationCount:
id_args: Dict[str, Any] = {}
id_args = {} # type: Dict[str, Any]
if table == RealmCount:
id_args = {'realm': realm}
if table == UserCount:
@@ -105,13 +96,13 @@ class Command(BaseCommand):
for end_time, value in zip(end_times, values) if value != 0])
stat = COUNT_STATS['1day_actives::day']
realm_data: Mapping[Optional[str], List[int]] = {
realm_data = {
None: self.generate_fixture_data(stat, .08, .02, 3, .3, 6, partial_sum=True),
}
} # type: Mapping[Optional[str], List[int]]
insert_fixture_data(stat, realm_data, RealmCount)
installation_data: Mapping[Optional[str], List[int]] = {
installation_data = {
None: self.generate_fixture_data(stat, .8, .2, 4, .3, 6, partial_sum=True),
}
} # type: Mapping[Optional[str], List[int]]
insert_fixture_data(stat, installation_data, InstallationCount)
FillState.objects.create(property=stat.property, end_time=last_end_time,
state=FillState.DONE)
@@ -141,9 +132,8 @@ class Command(BaseCommand):
state=FillState.DONE)
stat = COUNT_STATS['messages_sent:is_bot:hour']
user_data: Mapping[Optional[str], List[int]] = {
'false': self.generate_fixture_data(stat, 2, 1, 1.5, .6, 8, holiday_rate=.1),
}
user_data = {'false': self.generate_fixture_data(
stat, 2, 1, 1.5, .6, 8, holiday_rate=.1)} # type: Mapping[Optional[str], List[int]]
insert_fixture_data(stat, user_data, UserCount)
realm_data = {'false': self.generate_fixture_data(stat, 35, 15, 6, .6, 4),
'true': self.generate_fixture_data(stat, 15, 15, 3, .4, 2)}
@@ -219,22 +209,8 @@ class Command(BaseCommand):
realm_data = {'false': self.generate_fixture_data(stat, 30, 5, 6, .6, 4),
'true': self.generate_fixture_data(stat, 20, 2, 3, .2, 3)}
insert_fixture_data(stat, realm_data, RealmCount)
stream_data: Mapping[Optional[str], List[int]] = {
'false': self.generate_fixture_data(stat, 10, 7, 5, .6, 4),
'true': self.generate_fixture_data(stat, 5, 3, 2, .4, 2),
}
stream_data = {'false': self.generate_fixture_data(stat, 10, 7, 5, .6, 4),
'true': self.generate_fixture_data(stat, 5, 3, 2, .4, 2)} # type: Mapping[Optional[str], List[int]]
insert_fixture_data(stat, stream_data, StreamCount)
FillState.objects.create(property=stat.property, end_time=last_end_time,
state=FillState.DONE)
stat = COUNT_STATS['messages_read::hour']
user_data = {
None: self.generate_fixture_data(stat, 7, 3, 2, .6, 8, holiday_rate=.1),
}
insert_fixture_data(stat, user_data, UserCount)
realm_data = {
None: self.generate_fixture_data(stat, 50, 35, 6, .6, 4)
}
insert_fixture_data(stat, realm_data, RealmCount)
FillState.objects.create(property=stat.property, end_time=last_end_time,
state=FillState.DONE)

View File

@@ -6,17 +6,8 @@ from django.core.management.base import BaseCommand, CommandError
from django.db.models import Count
from django.utils.timezone import now as timezone_now
from zerver.models import (
Message,
Realm,
Recipient,
Stream,
Subscription,
UserActivity,
UserMessage,
UserProfile,
get_realm,
)
from zerver.models import Message, Realm, Recipient, Stream, \
Subscription, UserActivity, UserMessage, UserProfile, get_realm
MOBILE_CLIENT_LIST = ["Android", "ios"]
HUMAN_CLIENT_LIST = MOBILE_CLIENT_LIST + ["website"]
@@ -75,7 +66,7 @@ class Command(BaseCommand):
fraction = 0.0
else:
fraction = numerator / float(denominator)
print(f"{fraction * 100:.2f}% of", text)
print("%.2f%% of" % (fraction * 100,), text)
def handle(self, *args: Any, **options: Any) -> None:
if options['realms']:
@@ -93,26 +84,26 @@ class Command(BaseCommand):
active_users = self.active_users(realm)
num_active = len(active_users)
print(f"{num_active} active users ({len(user_profiles)} total)")
print("%d active users (%d total)" % (num_active, len(user_profiles)))
streams = Stream.objects.filter(realm=realm).extra(
tables=['zerver_subscription', 'zerver_recipient'],
where=['zerver_subscription.recipient_id = zerver_recipient.id',
'zerver_recipient.type = 2',
'zerver_recipient.type_id = zerver_stream.id',
'zerver_subscription.active = true']).annotate(count=Count("name"))
print(f"{streams.count()} streams")
print("%d streams" % (streams.count(),))
for days_ago in (1, 7, 30):
print(f"In last {days_ago} days, users sent:")
print("In last %d days, users sent:" % (days_ago,))
sender_quantities = [self.messages_sent_by(user, days_ago) for user in user_profiles]
for quantity in sorted(sender_quantities, reverse=True):
print(quantity, end=' ')
print("")
print(f"{self.stream_messages(realm, days_ago)} stream messages")
print(f"{self.private_messages(realm, days_ago)} one-on-one private messages")
print(f"{self.api_messages(realm, days_ago)} messages sent via the API")
print(f"{self.group_private_messages(realm, days_ago)} group private messages")
print("%d stream messages" % (self.stream_messages(realm, days_ago),))
print("%d one-on-one private messages" % (self.private_messages(realm, days_ago),))
print("%d messages sent via the API" % (self.api_messages(realm, days_ago),))
print("%d group private messages" % (self.group_private_messages(realm, days_ago),))
num_notifications_enabled = len([x for x in active_users if x.enable_desktop_notifications])
self.report_percentage(num_notifications_enabled, num_active,
@@ -132,7 +123,7 @@ class Command(BaseCommand):
starrers = UserMessage.objects.filter(user_profile__in=user_profiles,
flags=UserMessage.flags.starred).values(
"user_profile").annotate(count=Count("user_profile"))
print("{} users have starred {} messages".format(
print("%d users have starred %d messages" % (
len(starrers), sum([elt["count"] for elt in starrers])))
active_user_subs = Subscription.objects.filter(
@@ -141,20 +132,20 @@ class Command(BaseCommand):
# Streams not in home view
non_home_view = active_user_subs.filter(is_muted=True).values(
"user_profile").annotate(count=Count("user_profile"))
print("{} users have {} streams not in home view".format(
print("%d users have %d streams not in home view" % (
len(non_home_view), sum([elt["count"] for elt in non_home_view])))
# Code block markup
markup_messages = human_messages.filter(
sender__realm=realm, content__contains="~~~").values(
"sender").annotate(count=Count("sender"))
print("{} users have used code block markup on {} messages".format(
print("%d users have used code block markup on %s messages" % (
len(markup_messages), sum([elt["count"] for elt in markup_messages])))
# Notifications for stream messages
notifications = active_user_subs.filter(desktop_notifications=True).values(
"user_profile").annotate(count=Count("user_profile"))
print("{} users receive desktop notifications for {} streams".format(
print("%d users receive desktop notifications for %d streams" % (
len(notifications), sum([elt["count"] for elt in notifications])))
print("")

View File

@@ -4,8 +4,8 @@ from typing import Any
from django.core.management.base import BaseCommand, CommandError
from django.db.models import Q
from zerver.models import Message, Realm, Recipient, Stream, Subscription, get_realm
from zerver.models import Message, Realm, \
Recipient, Stream, Subscription, get_realm
class Command(BaseCommand):
help = "Generate statistics on the streams for a realm."
@@ -36,21 +36,21 @@ class Command(BaseCommand):
public_count += 1
print("------------")
print(realm.string_id, end=' ')
print("{:>10} {} public streams and".format("(", public_count), end=' ')
print(f"{private_count} private streams )")
print("%10s %d public streams and" % ("(", public_count), end=' ')
print("%d private streams )" % (private_count,))
print("------------")
print("{:>25} {:>15} {:>10} {:>12}".format("stream", "subscribers", "messages", "type"))
print("%25s %15s %10s %12s" % ("stream", "subscribers", "messages", "type"))
for stream in streams:
if stream.invite_only:
stream_type = 'private'
else:
stream_type = 'public'
print(f"{stream.name:>25}", end=' ')
print("%25s" % (stream.name,), end=' ')
recipient = Recipient.objects.filter(type=Recipient.STREAM, type_id=stream.id)
print("{:10}".format(len(Subscription.objects.filter(recipient=recipient,
active=True))), end=' ')
print("%10d" % (len(Subscription.objects.filter(recipient=recipient,
active=True)),), end=' ')
num_messages = len(Message.objects.filter(recipient=recipient))
print(f"{num_messages:12}", end=' ')
print(f"{stream_type:>15}")
print("%12d" % (num_messages,), end=' ')
print("%15s" % (stream_type,))
print("")

View File

@@ -1,13 +1,13 @@
import os
import time
from argparse import ArgumentParser
from datetime import timezone
from typing import Any, Dict
from django.conf import settings
from django.core.management.base import BaseCommand
from django.utils.dateparse import parse_datetime
from django.utils.timezone import now as timezone_now
from django.utils.timezone import utc as timezone_utc
from analytics.lib.counts import COUNT_STATS, logger, process_count_stat
from scripts.lib.zulip_tools import ENDC, WARNING
@@ -15,7 +15,6 @@ from zerver.lib.remote_server import send_analytics_to_remote_server
from zerver.lib.timestamp import floor_to_hour
from zerver.models import Realm
class Command(BaseCommand):
help = """Fills Analytics tables.
@@ -60,18 +59,18 @@ class Command(BaseCommand):
fill_to_time = parse_datetime(options['time'])
if options['utc']:
fill_to_time = fill_to_time.replace(tzinfo=timezone.utc)
fill_to_time = fill_to_time.replace(tzinfo=timezone_utc)
if fill_to_time.tzinfo is None:
raise ValueError("--time must be timezone aware. Maybe you meant to use the --utc option?")
fill_to_time = floor_to_hour(fill_to_time.astimezone(timezone.utc))
fill_to_time = floor_to_hour(fill_to_time.astimezone(timezone_utc))
if options['stat'] is not None:
stats = [COUNT_STATS[options['stat']]]
else:
stats = list(COUNT_STATS.values())
logger.info("Starting updating analytics counts through %s", fill_to_time)
logger.info("Starting updating analytics counts through %s" % (fill_to_time,))
if options['verbose']:
start = time.time()
last = start
@@ -79,12 +78,13 @@ class Command(BaseCommand):
for stat in stats:
process_count_stat(stat, fill_to_time)
if options['verbose']:
print(f"Updated {stat.property} in {time.time() - last:.3f}s")
print("Updated %s in %.3fs" % (stat.property, time.time() - last))
last = time.time()
if options['verbose']:
print(f"Finished updating analytics counts through {fill_to_time} in {time.time() - start:.3f}s")
logger.info("Finished updating analytics counts through %s", fill_to_time)
print("Finished updating analytics counts through %s in %.3fs" %
(fill_to_time, time.time() - start))
logger.info("Finished updating analytics counts through %s" % (fill_to_time,))
if settings.PUSH_NOTIFICATION_BOUNCER_URL and settings.SUBMIT_USAGE_STATISTICS:
send_analytics_to_remote_server()

View File

@@ -7,7 +7,6 @@ from django.utils.timezone import now as timezone_now
from zerver.models import Message, Realm, Stream, UserProfile, get_realm
class Command(BaseCommand):
help = "Generate statistics on user activity."
@@ -32,11 +31,11 @@ class Command(BaseCommand):
for realm in realms:
print(realm.string_id)
user_profiles = UserProfile.objects.filter(realm=realm, is_active=True)
print(f"{len(user_profiles)} users")
print(f"{len(Stream.objects.filter(realm=realm))} streams")
print("%d users" % (len(user_profiles),))
print("%d streams" % (len(Stream.objects.filter(realm=realm)),))
for user_profile in user_profiles:
print(f"{user_profile.email:>35}", end=' ')
print("%35s" % (user_profile.email,), end=' ')
for week in range(10):
print(f"{self.messages_sent_by(user_profile, week):5}", end=' ')
print("%5d" % (self.messages_sent_by(user_profile, week),), end=' ')
print("")

View File

@@ -1,8 +1,8 @@
# -*- coding: utf-8 -*-
import django.db.models.deletion
from django.conf import settings
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
@@ -89,22 +89,22 @@ class Migration(migrations.Migration):
),
migrations.AlterUniqueTogether(
name='usercount',
unique_together={('user', 'property', 'end_time', 'interval')},
unique_together=set([('user', 'property', 'end_time', 'interval')]),
),
migrations.AlterUniqueTogether(
name='streamcount',
unique_together={('stream', 'property', 'end_time', 'interval')},
unique_together=set([('stream', 'property', 'end_time', 'interval')]),
),
migrations.AlterUniqueTogether(
name='realmcount',
unique_together={('realm', 'property', 'end_time', 'interval')},
unique_together=set([('realm', 'property', 'end_time', 'interval')]),
),
migrations.AlterUniqueTogether(
name='installationcount',
unique_together={('property', 'end_time', 'interval')},
unique_together=set([('property', 'end_time', 'interval')]),
),
migrations.AlterUniqueTogether(
name='huddlecount',
unique_together={('huddle', 'property', 'end_time', 'interval')},
unique_together=set([('huddle', 'property', 'end_time', 'interval')]),
),
]

View File

@@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
@@ -10,7 +10,7 @@ class Migration(migrations.Migration):
operations = [
migrations.AlterUniqueTogether(
name='huddlecount',
unique_together=set(),
unique_together=set([]),
),
migrations.RemoveField(
model_name='huddlecount',

View File

@@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [

View File

@@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [

View File

@@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [

View File

@@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
@@ -10,18 +10,18 @@ class Migration(migrations.Migration):
operations = [
migrations.AlterUniqueTogether(
name='installationcount',
unique_together={('property', 'subgroup', 'end_time', 'interval')},
unique_together=set([('property', 'subgroup', 'end_time', 'interval')]),
),
migrations.AlterUniqueTogether(
name='realmcount',
unique_together={('realm', 'property', 'subgroup', 'end_time', 'interval')},
unique_together=set([('realm', 'property', 'subgroup', 'end_time', 'interval')]),
),
migrations.AlterUniqueTogether(
name='streamcount',
unique_together={('stream', 'property', 'subgroup', 'end_time', 'interval')},
unique_together=set([('stream', 'property', 'subgroup', 'end_time', 'interval')]),
),
migrations.AlterUniqueTogether(
name='usercount',
unique_together={('user', 'property', 'subgroup', 'end_time', 'interval')},
unique_together=set([('user', 'property', 'subgroup', 'end_time', 'interval')]),
),
]

View File

@@ -1,7 +1,7 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.10.4 on 2017-01-16 20:50
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
@@ -11,7 +11,7 @@ class Migration(migrations.Migration):
operations = [
migrations.AlterUniqueTogether(
name='installationcount',
unique_together={('property', 'subgroup', 'end_time')},
unique_together=set([('property', 'subgroup', 'end_time')]),
),
migrations.RemoveField(
model_name='installationcount',
@@ -19,7 +19,7 @@ class Migration(migrations.Migration):
),
migrations.AlterUniqueTogether(
name='realmcount',
unique_together={('realm', 'property', 'subgroup', 'end_time')},
unique_together=set([('realm', 'property', 'subgroup', 'end_time')]),
),
migrations.RemoveField(
model_name='realmcount',
@@ -27,7 +27,7 @@ class Migration(migrations.Migration):
),
migrations.AlterUniqueTogether(
name='streamcount',
unique_together={('stream', 'property', 'subgroup', 'end_time')},
unique_together=set([('stream', 'property', 'subgroup', 'end_time')]),
),
migrations.RemoveField(
model_name='streamcount',
@@ -35,7 +35,7 @@ class Migration(migrations.Migration):
),
migrations.AlterUniqueTogether(
name='usercount',
unique_together={('user', 'property', 'subgroup', 'end_time')},
unique_together=set([('user', 'property', 'subgroup', 'end_time')]),
),
migrations.RemoveField(
model_name='usercount',

View File

@@ -1,7 +1,7 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.10.5 on 2017-02-01 22:28
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
@@ -12,14 +12,14 @@ class Migration(migrations.Migration):
operations = [
migrations.AlterIndexTogether(
name='realmcount',
index_together={('property', 'end_time')},
index_together=set([('property', 'end_time')]),
),
migrations.AlterIndexTogether(
name='streamcount',
index_together={('property', 'realm', 'end_time')},
index_together=set([('property', 'realm', 'end_time')]),
),
migrations.AlterIndexTogether(
name='usercount',
index_together={('property', 'realm', 'end_time')},
index_together=set([('property', 'realm', 'end_time')]),
),
]

View File

@@ -1,8 +1,8 @@
# -*- coding: utf-8 -*-
from django.db import migrations
from django.db.backends.postgresql.schema import DatabaseSchemaEditor
from django.db.backends.postgresql_psycopg2.schema import DatabaseSchemaEditor
from django.db.migrations.state import StateApps
def delete_messages_sent_to_stream_stat(apps: StateApps, schema_editor: DatabaseSchemaEditor) -> None:
UserCount = apps.get_model('analytics', 'UserCount')
StreamCount = apps.get_model('analytics', 'StreamCount')

View File

@@ -1,8 +1,8 @@
# -*- coding: utf-8 -*-
from django.db import migrations
from django.db.backends.postgresql.schema import DatabaseSchemaEditor
from django.db.backends.postgresql_psycopg2.schema import DatabaseSchemaEditor
from django.db.migrations.state import StateApps
def clear_message_sent_by_message_type_values(apps: StateApps, schema_editor: DatabaseSchemaEditor) -> None:
UserCount = apps.get_model('analytics', 'UserCount')
StreamCount = apps.get_model('analytics', 'StreamCount')

View File

@@ -1,8 +1,8 @@
# -*- coding: utf-8 -*-
from django.db import migrations
from django.db.backends.postgresql.schema import DatabaseSchemaEditor
from django.db.backends.postgresql_psycopg2.schema import DatabaseSchemaEditor
from django.db.migrations.state import StateApps
def clear_analytics_tables(apps: StateApps, schema_editor: DatabaseSchemaEditor) -> None:
UserCount = apps.get_model('analytics', 'UserCount')
StreamCount = apps.get_model('analytics', 'StreamCount')

View File

@@ -1,7 +1,9 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.11.6 on 2018-01-29 08:14
from __future__ import unicode_literals
import django.db.models.deletion
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):

View File

@@ -1,4 +1,6 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.11.18 on 2019-02-02 02:47
from __future__ import unicode_literals
from django.db import migrations

View File

@@ -1,17 +0,0 @@
# Generated by Django 1.11.26 on 2020-01-27 04:32
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('analytics', '0013_remove_anomaly'),
]
operations = [
migrations.RemoveField(
model_name='fillstate',
name='last_modified',
),
]

View File

@@ -1,59 +0,0 @@
from django.db import migrations
from django.db.backends.postgresql.schema import DatabaseSchemaEditor
from django.db.migrations.state import StateApps
from django.db.models import Count, Sum
def clear_duplicate_counts(apps: StateApps, schema_editor: DatabaseSchemaEditor) -> None:
"""This is a preparatory migration for our Analytics tables.
The backstory is that Django's unique_together indexes do not properly
handle the subgroup=None corner case (allowing duplicate rows that have a
subgroup of None), which meant that in race conditions, rather than updating
an existing row for the property/(realm, stream, user)/time with subgroup=None, Django would
create a duplicate row.
In the next migration, we'll add a proper constraint to fix this bug, but
we need to fix any existing problematic rows before we can add that constraint.
We fix this in an appropriate fashion for each type of CountStat object; mainly
this means deleting the extra rows, but for LoggingCountStat objects, we need to
additionally combine the sums.
"""
count_tables = dict(realm=apps.get_model('analytics', 'RealmCount'),
user=apps.get_model('analytics', 'UserCount'),
stream=apps.get_model('analytics', 'StreamCount'),
installation=apps.get_model('analytics', 'InstallationCount'))
for name, count_table in count_tables.items():
value = [name, 'property', 'end_time']
if name == 'installation':
value = ['property', 'end_time']
counts = count_table.objects.filter(subgroup=None).values(*value).annotate(
Count('id'), Sum('value')).filter(id__count__gt=1)
for count in counts:
count.pop('id__count')
total_value = count.pop('value__sum')
duplicate_counts = list(count_table.objects.filter(**count))
first_count = duplicate_counts[0]
if count['property'] in ["invites_sent::day", "active_users_log:is_bot:day"]:
# For LoggingCountStat objects, the right fix is to combine the totals;
# for other CountStat objects, we expect the duplicates to have the same value.
# And so all we need to do is delete them.
first_count.value = total_value
first_count.save()
to_cleanup = duplicate_counts[1:]
for duplicate_count in to_cleanup:
duplicate_count.delete()
class Migration(migrations.Migration):
dependencies = [
('analytics', '0014_remove_fillstate_last_modified'),
]
operations = [
migrations.RunPython(clear_duplicate_counts,
reverse_code=migrations.RunPython.noop),
]
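The migration's dedup rule, restated in plain Python: rows that collide on the same key/property/end_time with subgroup=None keep a single row; for the two LoggingCountStat properties the survivor gets the summed value, while other duplicates are assumed equal and simply dropped. A toy sketch with made-up rows:

```python
# Toy restatement of the dedup rule with made-up rows (subgroup=None assumed).
from collections import defaultdict

LOGGING_PROPERTIES = {"invites_sent::day", "active_users_log:is_bot:day"}

rows = [  # (property, end_time, value)
    ("invites_sent::day", "2020-01-01", 3),
    ("invites_sent::day", "2020-01-01", 2),        # race-condition duplicate
    ("active_users:is_bot:day", "2020-01-01", 7),
    ("active_users:is_bot:day", "2020-01-01", 7),  # duplicate with equal value
]

grouped = defaultdict(list)
for prop, end_time, value in rows:
    grouped[(prop, end_time)].append(value)

deduped = {
    key: sum(values) if key[0] in LOGGING_PROPERTIES else values[0]
    for key, values in grouped.items()
}
print(deduped)
# {('invites_sent::day', '2020-01-01'): 5,
#  ('active_users:is_bot:day', '2020-01-01'): 7}
```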

View File

@@ -1,61 +0,0 @@
# Generated by Django 2.2.10 on 2020-02-29 19:40
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('analytics', '0015_clear_duplicate_counts'),
]
operations = [
migrations.AlterUniqueTogether(
name='installationcount',
unique_together=set(),
),
migrations.AlterUniqueTogether(
name='realmcount',
unique_together=set(),
),
migrations.AlterUniqueTogether(
name='streamcount',
unique_together=set(),
),
migrations.AlterUniqueTogether(
name='usercount',
unique_together=set(),
),
migrations.AddConstraint(
model_name='installationcount',
constraint=models.UniqueConstraint(condition=models.Q(subgroup__isnull=False), fields=('property', 'subgroup', 'end_time'), name='unique_installation_count'),
),
migrations.AddConstraint(
model_name='installationcount',
constraint=models.UniqueConstraint(condition=models.Q(subgroup__isnull=True), fields=('property', 'end_time'), name='unique_installation_count_null_subgroup'),
),
migrations.AddConstraint(
model_name='realmcount',
constraint=models.UniqueConstraint(condition=models.Q(subgroup__isnull=False), fields=('realm', 'property', 'subgroup', 'end_time'), name='unique_realm_count'),
),
migrations.AddConstraint(
model_name='realmcount',
constraint=models.UniqueConstraint(condition=models.Q(subgroup__isnull=True), fields=('realm', 'property', 'end_time'), name='unique_realm_count_null_subgroup'),
),
migrations.AddConstraint(
model_name='streamcount',
constraint=models.UniqueConstraint(condition=models.Q(subgroup__isnull=False), fields=('stream', 'property', 'subgroup', 'end_time'), name='unique_stream_count'),
),
migrations.AddConstraint(
model_name='streamcount',
constraint=models.UniqueConstraint(condition=models.Q(subgroup__isnull=True), fields=('stream', 'property', 'end_time'), name='unique_stream_count_null_subgroup'),
),
migrations.AddConstraint(
model_name='usercount',
constraint=models.UniqueConstraint(condition=models.Q(subgroup__isnull=False), fields=('user', 'property', 'subgroup', 'end_time'), name='unique_user_count'),
),
migrations.AddConstraint(
model_name='usercount',
constraint=models.UniqueConstraint(condition=models.Q(subgroup__isnull=True), fields=('user', 'property', 'end_time'), name='unique_user_count_null_subgroup'),
),
]
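
The reason the 3.1 branch gives each table two conditional constraints, rather than the single unique_together that 2.1.x keeps, is SQL's NULL semantics: rows whose subgroup is NULL never compare equal to one another, so a plain unique index over (..., subgroup, end_time) cannot prevent duplicate NULL-subgroup rows, which is exactly the corruption the previous migration cleans up. One partial constraint enforces uniqueness where subgroup is present, and a second, with subgroup dropped from the key, covers the NULL case. The pattern in miniature (ExampleCount is a hypothetical stand-in for the real models):

from django.db import models
from django.db.models import Q, UniqueConstraint

class ExampleCount(models.Model):
    property = models.CharField(max_length=32)
    subgroup = models.CharField(max_length=16, null=True)
    end_time = models.DateTimeField()

    class Meta:
        constraints = [
            # Enforced only for rows that actually have a subgroup...
            UniqueConstraint(fields=['property', 'subgroup', 'end_time'],
                             condition=Q(subgroup__isnull=False),
                             name='example_unique'),
            # ...and separately for NULL-subgroup rows, with subgroup
            # removed from the key so they deduplicate against each other.
            UniqueConstraint(fields=['property', 'end_time'],
                             condition=Q(subgroup__isnull=True),
                             name='example_unique_null_subgroup'),
        ]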


@@ -2,23 +2,23 @@ import datetime
from typing import Optional
from django.db import models
from django.db.models import Q, UniqueConstraint
from zerver.lib.timestamp import floor_to_day
from zerver.models import Realm, Stream, UserProfile
class FillState(models.Model):
property: str = models.CharField(max_length=40, unique=True)
end_time: datetime.datetime = models.DateTimeField()
property = models.CharField(max_length=40, unique=True) # type: str
end_time = models.DateTimeField() # type: datetime.datetime
# Valid states are {DONE, STARTED}
DONE = 1
STARTED = 2
state: int = models.PositiveSmallIntegerField()
state = models.PositiveSmallIntegerField() # type: int
last_modified = models.DateTimeField(auto_now=True) # type: datetime.datetime
def __str__(self) -> str:
return f"<FillState: {self.property} {self.end_time} {self.state}>"
return "<FillState: %s %s %s>" % (self.property, self.end_time, self.state)
# The earliest/starting end_time in FillState
# We assume there is at least one realm
@@ -38,10 +38,10 @@ class BaseCount(models.Model):
# Note: When inheriting from BaseCount, you may want to rearrange
# the order of the columns in the migration to make sure they
# match how you'd like the table to be arranged.
property: str = models.CharField(max_length=32)
subgroup: Optional[str] = models.CharField(max_length=16, null=True)
end_time: datetime.datetime = models.DateTimeField()
value: int = models.BigIntegerField()
property = models.CharField(max_length=32) # type: str
subgroup = models.CharField(max_length=16, null=True) # type: Optional[str]
end_time = models.DateTimeField() # type: datetime.datetime
value = models.BigIntegerField() # type: int
class Meta:
abstract = True
@@ -49,83 +49,44 @@ class BaseCount(models.Model):
class InstallationCount(BaseCount):
class Meta:
# Handles invalid duplicate InstallationCount data
constraints = [
UniqueConstraint(
fields=["property", "subgroup", "end_time"],
condition=Q(subgroup__isnull=False),
name='unique_installation_count'),
UniqueConstraint(
fields=["property", "end_time"],
condition=Q(subgroup__isnull=True),
name='unique_installation_count_null_subgroup'),
]
unique_together = ("property", "subgroup", "end_time")
def __str__(self) -> str:
return f"<InstallationCount: {self.property} {self.subgroup} {self.value}>"
return "<InstallationCount: %s %s %s>" % (self.property, self.subgroup, self.value)
class RealmCount(BaseCount):
realm = models.ForeignKey(Realm, on_delete=models.CASCADE)
class Meta:
# Handles invalid duplicate RealmCount data
constraints = [
UniqueConstraint(
fields=["realm", "property", "subgroup", "end_time"],
condition=Q(subgroup__isnull=False),
name='unique_realm_count'),
UniqueConstraint(
fields=["realm", "property", "end_time"],
condition=Q(subgroup__isnull=True),
name='unique_realm_count_null_subgroup'),
]
unique_together = ("realm", "property", "subgroup", "end_time")
index_together = ["property", "end_time"]
def __str__(self) -> str:
return f"<RealmCount: {self.realm} {self.property} {self.subgroup} {self.value}>"
return "<RealmCount: %s %s %s %s>" % (self.realm, self.property, self.subgroup, self.value)
class UserCount(BaseCount):
user = models.ForeignKey(UserProfile, on_delete=models.CASCADE)
realm = models.ForeignKey(Realm, on_delete=models.CASCADE)
class Meta:
# Handles invalid duplicate UserCount data
constraints = [
UniqueConstraint(
fields=["user", "property", "subgroup", "end_time"],
condition=Q(subgroup__isnull=False),
name='unique_user_count'),
UniqueConstraint(
fields=["user", "property", "end_time"],
condition=Q(subgroup__isnull=True),
name='unique_user_count_null_subgroup'),
]
unique_together = ("user", "property", "subgroup", "end_time")
# This index dramatically improves the performance of
# aggregating from users to realms
index_together = ["property", "realm", "end_time"]
def __str__(self) -> str:
return f"<UserCount: {self.user} {self.property} {self.subgroup} {self.value}>"
return "<UserCount: %s %s %s %s>" % (self.user, self.property, self.subgroup, self.value)
class StreamCount(BaseCount):
stream = models.ForeignKey(Stream, on_delete=models.CASCADE)
realm = models.ForeignKey(Realm, on_delete=models.CASCADE)
class Meta:
# Handles invalid duplicate StreamCount data
constraints = [
UniqueConstraint(
fields=["stream", "property", "subgroup", "end_time"],
condition=Q(subgroup__isnull=False),
name='unique_stream_count'),
UniqueConstraint(
fields=["stream", "property", "end_time"],
condition=Q(subgroup__isnull=True),
name='unique_stream_count_null_subgroup'),
]
unique_together = ("stream", "property", "subgroup", "end_time")
# This index dramatically improves the performance of
# aggregating from streams to realms
index_together = ["property", "realm", "end_time"]
def __str__(self) -> str:
return f"<StreamCount: {self.stream} {self.property} {self.subgroup} {self.value} {self.id}>"
return "<StreamCount: %s %s %s %s %s>" % (
self.stream, self.property, self.subgroup, self.value, self.id)
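
A recurring motif in this file's hunks: f-strings (PEP 498) and variable annotations (PEP 526) both require Python 3.6, while the 2.1.x branch still supports Python 3.5, so each occurrence is rewritten as %-formatting and a type comment. The translation in miniature (Example is a hypothetical model, not from the diff):

import datetime

from django.db import models

class Example(models.Model):
    # 3.1 spelling:  end_time: datetime.datetime = models.DateTimeField()
    end_time = models.DateTimeField()  # type: datetime.datetime

    def __str__(self) -> str:
        # 3.1 spelling:  return f"<Example: {self.end_time}>"
        return "<Example: %s>" % (self.end_time,)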


@@ -1,76 +1,39 @@
from datetime import datetime, timedelta, timezone
from datetime import datetime, timedelta
from typing import Any, Dict, List, Optional, Tuple, Type
from unittest import mock
import mock
import ujson
from django.apps import apps
from django.db import models
from django.db.models import Sum
from django.test import TestCase
from django.utils.timezone import now as timezone_now
from psycopg2.sql import SQL, Literal
from django.utils.timezone import utc as timezone_utc
from analytics.lib.counts import (
COUNT_STATS,
CountStat,
DependentCountStat,
LoggingCountStat,
do_aggregate_to_summary_table,
do_drop_all_analytics_tables,
do_drop_single_stat,
do_fill_count_stat_at_hour,
do_increment_logging_stat,
get_count_stats,
process_count_stat,
sql_data_collector,
)
from analytics.models import (
BaseCount,
FillState,
InstallationCount,
RealmCount,
StreamCount,
UserCount,
installation_epoch,
)
from zerver.lib.actions import (
InvitationError,
do_activate_user,
do_create_user,
do_deactivate_user,
do_invite_users,
do_mark_all_as_read,
do_mark_stream_messages_as_read,
do_reactivate_user,
do_resend_user_invite_email,
do_revoke_user_invite,
do_update_message_flags,
update_user_activity_interval,
)
from analytics.lib.counts import COUNT_STATS, CountStat, \
DependentCountStat, LoggingCountStat, do_aggregate_to_summary_table, \
do_drop_all_analytics_tables, do_drop_single_stat, \
do_fill_count_stat_at_hour, do_increment_logging_stat, \
process_count_stat, sql_data_collector
from analytics.models import BaseCount, \
FillState, InstallationCount, RealmCount, StreamCount, \
UserCount, installation_epoch
from zerver.lib.actions import do_activate_user, do_create_user, \
do_deactivate_user, do_reactivate_user, update_user_activity_interval, \
do_invite_users, do_revoke_user_invite, do_resend_user_invite_email, \
InvitationError
from zerver.lib.create_user import create_user
from zerver.lib.test_classes import ZulipTestCase
from zerver.lib.timestamp import TimezoneNotUTCException, floor_to_day
from zerver.lib.topic import DB_TOPIC_NAME
from zerver.models import (
Client,
Huddle,
Message,
PreregistrationUser,
Realm,
RealmAuditLog,
Recipient,
Stream,
UserActivityInterval,
UserProfile,
get_client,
get_user,
)
from zerver.models import Client, Huddle, Message, Realm, \
RealmAuditLog, Recipient, Stream, UserActivityInterval, \
UserProfile, get_client, get_user, PreregistrationUser
class AnalyticsTestCase(ZulipTestCase):
class AnalyticsTestCase(TestCase):
MINUTE = timedelta(seconds = 60)
HOUR = MINUTE * 60
DAY = HOUR * 24
TIME_ZERO = datetime(1988, 3, 14, tzinfo=timezone.utc)
TIME_ZERO = datetime(1988, 3, 14).replace(tzinfo=timezone_utc)
TIME_LAST_HOUR = TIME_ZERO - HOUR
def setUp(self) -> None:
@@ -80,15 +43,16 @@ class AnalyticsTestCase(ZulipTestCase):
# used to generate unique names in self.create_*
self.name_counter = 100
# used as defaults in self.assertCountEquals
self.current_property: Optional[str] = None
self.current_property = None # type: Optional[str]
# Lightweight creation of users, streams, and messages
def create_user(self, **kwargs: Any) -> UserProfile:
self.name_counter += 1
defaults = {
'email': f'user{self.name_counter}@domain.tld',
'email': 'user%s@domain.tld' % (self.name_counter,),
'date_joined': self.TIME_LAST_HOUR,
'full_name': 'full_name',
'short_name': 'short_name',
'is_active': True,
'is_bot': False,
'realm': self.default_realm}
@@ -96,42 +60,33 @@ class AnalyticsTestCase(ZulipTestCase):
kwargs[key] = kwargs.get(key, value)
kwargs['delivery_email'] = kwargs['email']
with mock.patch("zerver.lib.create_user.timezone_now", return_value=kwargs['date_joined']):
pass_kwargs: Dict[str, Any] = {}
pass_kwargs = {} # type: Dict[str, Any]
if kwargs['is_bot']:
pass_kwargs['bot_type'] = UserProfile.DEFAULT_BOT
pass_kwargs['bot_owner'] = None
return create_user(
kwargs['email'],
'password',
kwargs['realm'],
active=kwargs['is_active'],
full_name=kwargs['full_name'],
role=UserProfile.ROLE_REALM_ADMINISTRATOR,
**pass_kwargs
)
return create_user(kwargs['email'], 'password', kwargs['realm'],
active=kwargs['is_active'],
full_name=kwargs['full_name'], short_name=kwargs['short_name'],
is_realm_admin=True, **pass_kwargs)
def create_stream_with_recipient(self, **kwargs: Any) -> Tuple[Stream, Recipient]:
self.name_counter += 1
defaults = {'name': f'stream name {self.name_counter}',
defaults = {'name': 'stream name %s' % (self.name_counter,),
'realm': self.default_realm,
'date_created': self.TIME_LAST_HOUR}
for key, value in defaults.items():
kwargs[key] = kwargs.get(key, value)
stream = Stream.objects.create(**kwargs)
recipient = Recipient.objects.create(type_id=stream.id, type=Recipient.STREAM)
stream.recipient = recipient
stream.save(update_fields=["recipient"])
return stream, recipient
def create_huddle_with_recipient(self, **kwargs: Any) -> Tuple[Huddle, Recipient]:
self.name_counter += 1
defaults = {'huddle_hash': f'hash{self.name_counter}'}
defaults = {'huddle_hash': 'hash%s' % (self.name_counter,)}
for key, value in defaults.items():
kwargs[key] = kwargs.get(key, value)
huddle = Huddle.objects.create(**kwargs)
recipient = Recipient.objects.create(type_id=huddle.id, type=Recipient.HUDDLE)
huddle.recipient = recipient
huddle.save(update_fields=["recipient"])
return huddle, recipient
def create_message(self, sender: UserProfile, recipient: Recipient, **kwargs: Any) -> Message:
@@ -190,7 +145,7 @@ class AnalyticsTestCase(ZulipTestCase):
'end_time': self.TIME_ZERO,
'value': 1}
for values in arg_values:
kwargs: Dict[str, Any] = {}
kwargs = {} # type: Dict[str, Any]
for i in range(len(values)):
kwargs[arg_keys[i]] = values[i]
for key, value in defaults.items():
@@ -208,13 +163,8 @@ class AnalyticsTestCase(ZulipTestCase):
class TestProcessCountStat(AnalyticsTestCase):
def make_dummy_count_stat(self, property: str) -> CountStat:
query = lambda kwargs: SQL("""
INSERT INTO analytics_realmcount (realm_id, value, property, end_time)
VALUES ({default_realm_id}, 1, {property}, %(time_end)s)
""").format(
default_realm_id=Literal(self.default_realm.id),
property=Literal(property),
)
query = """INSERT INTO analytics_realmcount (realm_id, value, property, end_time)
VALUES (%s, 1, '%s', %%%%(time_end)s)""" % (self.default_realm.id, property)
return CountStat(property, sql_data_collector(RealmCount, query, None), CountStat.HOUR)
def assertFillStateEquals(self, stat: CountStat, end_time: datetime,
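
The quadrupled percent signs in the 2.1.x version of make_dummy_count_stat are the subtle part of this hunk: the string is %-formatted once at construction (collapsing %%%% to %%) and, in the older collector pipeline, appears to pass through one more round of %-formatting before execution, so that psycopg2 ultimately receives %(time_end)s as its parameter placeholder. The psycopg2.sql composition on the 3.1 side avoids that escape counting entirely; a minimal standalone sketch of the same idea (build_query is an illustrative helper, not a Zulip function):

from psycopg2.sql import SQL, Composable, Literal

# Compose the INSERT so realm_id and property are quoted as SQL literals,
# leaving %(time_end)s as the single placeholder for the cursor; no manual
# %-escape bookkeeping is needed.
def build_query(realm_id: int, property: str) -> Composable:
    return SQL("""
        INSERT INTO analytics_realmcount (realm_id, value, property, end_time)
        VALUES ({realm_id}, 1, {property}, %(time_end)s)
    """).format(realm_id=Literal(realm_id), property=Literal(property))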
@@ -312,13 +262,8 @@ class TestProcessCountStat(AnalyticsTestCase):
def test_process_dependent_stat(self) -> None:
stat1 = self.make_dummy_count_stat('stat1')
stat2 = self.make_dummy_count_stat('stat2')
query = lambda kwargs: SQL("""
INSERT INTO analytics_realmcount (realm_id, value, property, end_time)
VALUES ({default_realm_id}, 1, {property}, %(time_end)s)
""").format(
default_realm_id=Literal(self.default_realm.id),
property=Literal('stat3'),
)
query = """INSERT INTO analytics_realmcount (realm_id, value, property, end_time)
VALUES (%s, 1, '%s', %%%%(time_end)s)""" % (self.default_realm.id, 'stat3')
stat3 = DependentCountStat('stat3', sql_data_collector(RealmCount, query, None),
CountStat.HOUR,
dependencies=['stat1', 'stat2'])
@@ -351,13 +296,8 @@ class TestProcessCountStat(AnalyticsTestCase):
self.assertFillStateEquals(stat3, hour[2])
# test daily dependent stat with hourly dependencies
query = lambda kwargs: SQL("""
INSERT INTO analytics_realmcount (realm_id, value, property, end_time)
VALUES ({default_realm_id}, 1, {property}, %(time_end)s)
""").format(
default_realm_id=Literal(self.default_realm.id),
property=Literal('stat4'),
)
query = """INSERT INTO analytics_realmcount (realm_id, value, property, end_time)
VALUES (%s, 1, '%s', %%%%(time_end)s)""" % (self.default_realm.id, 'stat4')
stat4 = DependentCountStat('stat4', sql_data_collector(RealmCount, query, None),
CountStat.DAY,
dependencies=['stat1', 'stat2'])
@@ -380,10 +320,10 @@ class TestCountStats(AnalyticsTestCase):
date_created=self.TIME_ZERO-2*self.DAY)
for minutes_ago in [0, 1, 61, 60*24+1]:
creation_time = self.TIME_ZERO - minutes_ago*self.MINUTE
user = self.create_user(email=f'user-{minutes_ago}@second.analytics',
user = self.create_user(email='user-%s@second.analytics' % (minutes_ago,),
realm=self.second_realm, date_joined=creation_time)
recipient = self.create_stream_with_recipient(
name=f'stream {minutes_ago}', realm=self.second_realm,
name='stream %s' % (minutes_ago,), realm=self.second_realm,
date_created=creation_time)[1]
self.create_message(user, recipient, date_sent=creation_time)
self.hourly_user = get_user('user-1@second.analytics', self.second_realm)
@@ -423,29 +363,6 @@ class TestCountStats(AnalyticsTestCase):
self.assertTableState(UserCount, [], [])
self.assertTableState(StreamCount, [], [])
def test_active_users_by_is_bot_for_realm_constraint(self) -> None:
# For single Realm
COUNT_STATS = get_count_stats(self.default_realm)
stat = COUNT_STATS['active_users:is_bot:day']
self.current_property = stat.property
# To be included
self.create_user(is_bot=True, date_joined=self.TIME_ZERO-25*self.HOUR)
self.create_user(is_bot=False)
# To be excluded
self.create_user(email='test@second.analytics',
realm=self.second_realm, date_joined=self.TIME_ZERO-2*self.DAY)
do_fill_count_stat_at_hour(stat, self.TIME_ZERO, self.default_realm)
self.assertTableState(RealmCount, ['value', 'subgroup'],
[[1, 'true'], [1, 'false']])
# No aggregation to InstallationCount with realm constraint
self.assertTableState(InstallationCount, ['value', 'subgroup'], [])
self.assertTableState(UserCount, [], [])
self.assertTableState(StreamCount, [], [])
def test_messages_sent_by_is_bot(self) -> None:
stat = COUNT_STATS['messages_sent:is_bot:hour']
self.current_property = stat.property
@@ -475,46 +392,6 @@ class TestCountStats(AnalyticsTestCase):
self.assertTableState(InstallationCount, ['value', 'subgroup'], [[3, 'false'], [3, 'true']])
self.assertTableState(StreamCount, [], [])
def test_messages_sent_by_is_bot_realm_constraint(self) -> None:
# For single Realm
COUNT_STATS = get_count_stats(self.default_realm)
stat = COUNT_STATS['messages_sent:is_bot:hour']
self.current_property = stat.property
bot = self.create_user(is_bot=True)
human1 = self.create_user()
human2 = self.create_user()
recipient_human1 = Recipient.objects.get(type_id=human1.id,
type=Recipient.PERSONAL)
recipient_stream = self.create_stream_with_recipient()[1]
recipient_huddle = self.create_huddle_with_recipient()[1]
# To be included
self.create_message(bot, recipient_human1)
self.create_message(bot, recipient_stream)
self.create_message(bot, recipient_huddle)
self.create_message(human1, recipient_human1)
self.create_message(human2, recipient_human1)
# To be excluded
self.create_message(self.hourly_user, recipient_human1)
self.create_message(self.hourly_user, recipient_stream)
self.create_message(self.hourly_user, recipient_huddle)
do_fill_count_stat_at_hour(stat, self.TIME_ZERO, self.default_realm)
self.assertTableState(UserCount, ['value', 'subgroup', 'user'],
[[1, 'false', human1], [1, 'false', human2],
[3, 'true', bot]])
self.assertTableState(RealmCount, ['value', 'subgroup', 'realm'],
[[2, 'false', self.default_realm],
[3, 'true', self.default_realm]])
# No aggregation to InstallationCount with realm constraint
self.assertTableState(InstallationCount, ['value', 'subgroup'], [])
self.assertTableState(StreamCount, [], [])
def test_messages_sent_by_message_type(self) -> None:
stat = COUNT_STATS['messages_sent:message_type:day']
self.current_property = stat.property
@@ -577,43 +454,6 @@ class TestCountStats(AnalyticsTestCase):
[2, 'huddle_message']])
self.assertTableState(StreamCount, [], [])
def test_messages_sent_by_message_type_realm_constraint(self) -> None:
# For single Realm
COUNT_STATS = get_count_stats(self.default_realm)
stat = COUNT_STATS['messages_sent:message_type:day']
self.current_property = stat.property
user = self.create_user()
user_recipient = Recipient.objects.get(type_id=user.id, type=Recipient.PERSONAL)
private_stream_recipient = self.create_stream_with_recipient(invite_only=True)[1]
stream_recipient = self.create_stream_with_recipient()[1]
huddle_recipient = self.create_huddle_with_recipient()[1]
# To be included
self.create_message(user, user_recipient)
self.create_message(user, private_stream_recipient)
self.create_message(user, stream_recipient)
self.create_message(user, huddle_recipient)
do_fill_count_stat_at_hour(stat, self.TIME_ZERO, self.default_realm)
# To be excluded
self.create_message(self.hourly_user, user_recipient)
self.create_message(self.hourly_user, private_stream_recipient)
self.create_message(self.hourly_user, stream_recipient)
self.create_message(self.hourly_user, huddle_recipient)
self.assertTableState(UserCount, ['value', 'subgroup', 'user'],
[[1, 'private_message', user], [1, 'private_stream', user],
[1, 'huddle_message', user], [1, 'public_stream', user]])
self.assertTableState(RealmCount, ['value', 'subgroup'],
[[1, 'private_message'], [1, 'private_stream'],
[1, 'public_stream'], [1, 'huddle_message']])
# No aggregation to InstallationCount with realm constraint
self.assertTableState(InstallationCount, ['value', 'subgroup'], [])
self.assertTableState(StreamCount, [], [])
def test_messages_sent_to_recipients_with_same_id(self) -> None:
stat = COUNT_STATS['messages_sent:message_type:day']
self.current_property = stat.property
@@ -668,42 +508,6 @@ class TestCountStats(AnalyticsTestCase):
[[4, website_client_id], [3, client2_id]])
self.assertTableState(StreamCount, [], [])
def test_messages_sent_by_client_realm_constraint(self) -> None:
# For single Realm
COUNT_STATS = get_count_stats(self.default_realm)
stat = COUNT_STATS['messages_sent:client:day']
self.current_property = stat.property
user1 = self.create_user(is_bot=True)
user2 = self.create_user()
recipient_user2 = Recipient.objects.get(type_id=user2.id, type=Recipient.PERSONAL)
client2 = Client.objects.create(name='client2')
# To be included
self.create_message(user1, recipient_user2, sending_client=client2)
self.create_message(user2, recipient_user2, sending_client=client2)
self.create_message(user2, recipient_user2)
# To be excluded
self.create_message(self.hourly_user, recipient_user2, sending_client=client2)
self.create_message(self.hourly_user, recipient_user2, sending_client=client2)
self.create_message(self.hourly_user, recipient_user2)
do_fill_count_stat_at_hour(stat, self.TIME_ZERO, self.default_realm)
client2_id = str(client2.id)
website_client_id = str(get_client('website').id) # default for self.create_message
self.assertTableState(UserCount, ['value', 'subgroup', 'user'],
[[1, client2_id, user1], [1, client2_id, user2],
[1, website_client_id, user2]])
self.assertTableState(RealmCount, ['value', 'subgroup'],
[[1, website_client_id], [2, client2_id]])
# No aggregation to InstallationCount with realm constraint
self.assertTableState(InstallationCount, ['value', 'subgroup'], [])
self.assertTableState(StreamCount, [], [])
def test_messages_sent_to_stream_by_is_bot(self) -> None:
stat = COUNT_STATS['messages_in_stream:is_bot:day']
self.current_property = stat.property
@@ -741,39 +545,6 @@ class TestCountStats(AnalyticsTestCase):
self.assertTableState(InstallationCount, ['value', 'subgroup'], [[5, 'false'], [2, 'true']])
self.assertTableState(UserCount, [], [])
def test_messages_sent_to_stream_by_is_bot_realm_constraint(self) -> None:
# For single Realm
COUNT_STATS = get_count_stats(self.default_realm)
stat = COUNT_STATS['messages_in_stream:is_bot:day']
self.current_property = stat.property
human1 = self.create_user()
bot = self.create_user(is_bot=True)
realm = {'realm': self.second_realm}
stream1, recipient_stream1 = self.create_stream_with_recipient()
stream2, recipient_stream2 = self.create_stream_with_recipient(**realm)
# To be included
self.create_message(human1, recipient_stream1)
self.create_message(bot, recipient_stream1)
# To be excluded
self.create_message(self.hourly_user, recipient_stream2)
self.create_message(self.daily_user, recipient_stream2)
do_fill_count_stat_at_hour(stat, self.TIME_ZERO, self.default_realm)
self.assertTableState(StreamCount, ['value', 'subgroup', 'stream'],
[[1, 'false', stream1],
[1, 'true', stream1]])
self.assertTableState(RealmCount, ['value', 'subgroup', 'realm'],
[[1, 'false'], [1, 'true']])
# No aggregation to InstallationCount with realm constraint
self.assertTableState(InstallationCount, ['value', 'subgroup'], [])
self.assertTableState(UserCount, [], [])
def create_interval(self, user: UserProfile, start_offset: timedelta,
end_offset: timedelta) -> None:
UserActivityInterval.objects.create(
@@ -823,34 +594,6 @@ class TestCountStats(AnalyticsTestCase):
self.assertTableState(InstallationCount, ['value'], [[6]])
self.assertTableState(StreamCount, [], [])
def test_1day_actives_realm_constraint(self) -> None:
# For single Realm
COUNT_STATS = get_count_stats(self.default_realm)
stat = COUNT_STATS['1day_actives::day']
self.current_property = stat.property
_1day = 1*self.DAY - UserActivityInterval.MIN_INTERVAL_LENGTH
user1 = self.create_user()
user2 = self.create_user()
# To be included
self.create_interval(user1, 20*self.HOUR, 19*self.HOUR)
self.create_interval(user2, _1day + self.DAY, _1day)
# To be excluded
user3 = self.create_user(realm=self.second_realm)
self.create_interval(user3, 20*self.MINUTE, 19*self.MINUTE)
do_fill_count_stat_at_hour(stat, self.TIME_ZERO, self.default_realm)
self.assertTableState(UserCount, ['value', 'user'],
[[1, user1], [1, user2]])
self.assertTableState(RealmCount, ['value', 'realm'],
[[2, self.default_realm]])
# No aggregation to InstallationCount with realm constraint
self.assertTableState(InstallationCount, ['value'], [])
self.assertTableState(StreamCount, [], [])
def test_15day_actives(self) -> None:
stat = COUNT_STATS['15day_actives::day']
self.current_property = stat.property
@@ -894,36 +637,6 @@ class TestCountStats(AnalyticsTestCase):
self.assertTableState(InstallationCount, ['value'], [[6]])
self.assertTableState(StreamCount, [], [])
def test_15day_actives_realm_constraint(self) -> None:
# For single Realm
COUNT_STATS = get_count_stats(self.default_realm)
stat = COUNT_STATS['15day_actives::day']
self.current_property = stat.property
_15day = 15*self.DAY - UserActivityInterval.MIN_INTERVAL_LENGTH
user1 = self.create_user()
user2 = self.create_user()
user3 = self.create_user(realm=self.second_realm)
# To be included
self.create_interval(user1, _15day + self.DAY, _15day)
self.create_interval(user2, 20*self.HOUR, 19*self.HOUR)
# To be excluded
self.create_interval(user3, 20*self.HOUR, 19*self.HOUR)
do_fill_count_stat_at_hour(stat, self.TIME_ZERO, self.default_realm)
self.assertTableState(UserCount, ['value', 'user'],
[[1, user1], [1, user2]])
self.assertTableState(RealmCount, ['value', 'realm'],
[[2, self.default_realm]])
# No aggregation to InstallationCount with realm constraint
self.assertTableState(InstallationCount, ['value'], [])
self.assertTableState(StreamCount, [], [])
def test_minutes_active(self) -> None:
stat = COUNT_STATS['minutes_active::day']
self.current_property = stat.property
@@ -966,35 +679,6 @@ class TestCountStats(AnalyticsTestCase):
self.assertTableState(InstallationCount, ['value'], [[61 + 121 + 24*60 + 1]])
self.assertTableState(StreamCount, [], [])
def test_minutes_active_realm_constraint(self) -> None:
# For single Realm
COUNT_STATS = get_count_stats(self.default_realm)
stat = COUNT_STATS['minutes_active::day']
self.current_property = stat.property
# Outside time range, should not appear. Also testing for intervals
# starting and ending on boundary
user1 = self.create_user()
user2 = self.create_user()
user3 = self.create_user(realm=self.second_realm)
# To be included
self.create_interval(user1, 20*self.HOUR, 19*self.HOUR)
self.create_interval(user2, 20*self.MINUTE, 19*self.MINUTE)
# To be excluded
self.create_interval(user3, 20*self.MINUTE, 19*self.MINUTE)
do_fill_count_stat_at_hour(stat, self.TIME_ZERO, self.default_realm)
self.assertTableState(UserCount, ['value', 'user'],
[[60, user1], [1, user2]])
self.assertTableState(RealmCount, ['value', 'realm'],
[[60 + 1, self.default_realm]])
# No aggregation to InstallationCount with realm constraint
self.assertTableState(InstallationCount, ['value'], [])
self.assertTableState(StreamCount, [], [])
class TestDoAggregateToSummaryTable(AnalyticsTestCase):
# do_aggregate_to_summary_table is mostly tested by the end to end
# nature of the tests in TestCountStats. But want to highlight one
@@ -1061,12 +745,12 @@ class TestDoIncrementLoggingStat(AnalyticsTestCase):
self.current_property = 'test'
self.assertTableState(RealmCount, ['value', 'subgroup', 'end_time'],
[[1, 'subgroup1', self.TIME_ZERO], [1, 'subgroup2', self.TIME_ZERO],
[1, 'subgroup1', self.TIME_LAST_HOUR]])
[1, 'subgroup1', self.TIME_LAST_HOUR]])
# This should trigger the get part of get_or_create
do_increment_logging_stat(self.default_realm, stat, 'subgroup1', self.TIME_ZERO)
self.assertTableState(RealmCount, ['value', 'subgroup', 'end_time'],
[[2, 'subgroup1', self.TIME_ZERO], [1, 'subgroup2', self.TIME_ZERO],
[1, 'subgroup1', self.TIME_LAST_HOUR]])
[1, 'subgroup1', self.TIME_LAST_HOUR]])
def test_increment(self) -> None:
stat = LoggingCountStat('test', RealmCount, CountStat.DAY)
@@ -1103,7 +787,7 @@ class TestLoggingCountStats(AnalyticsTestCase):
def test_active_users_log_by_is_bot(self) -> None:
property = 'active_users_log:is_bot:day'
user = do_create_user('email', 'password', self.default_realm, 'full_name')
user = do_create_user('email', 'password', self.default_realm, 'full_name', 'short_name')
self.assertEqual(1, RealmCount.objects.filter(property=property, subgroup=False)
.aggregate(Sum('value'))['value__sum'])
do_deactivate_user(user)
@@ -1158,39 +842,6 @@ class TestLoggingCountStats(AnalyticsTestCase):
do_resend_user_invite_email(PreregistrationUser.objects.first())
assertInviteCountEquals(6)
def test_messages_read_hour(self) -> None:
read_count_property = 'messages_read::hour'
interactions_property = 'messages_read_interactions::hour'
user1 = self.create_user()
user2 = self.create_user()
stream, recipient = self.create_stream_with_recipient()
self.subscribe(user1, stream.name)
self.subscribe(user2, stream.name)
self.send_personal_message(user1, user2)
client = get_client("website")
do_mark_all_as_read(user2, client)
self.assertEqual(1, UserCount.objects.filter(property=read_count_property)
.aggregate(Sum('value'))['value__sum'])
self.assertEqual(1, UserCount.objects.filter(property=interactions_property)
.aggregate(Sum('value'))['value__sum'])
self.send_stream_message(user1, stream.name)
self.send_stream_message(user1, stream.name)
do_mark_stream_messages_as_read(user2, client, stream)
self.assertEqual(3, UserCount.objects.filter(property=read_count_property)
.aggregate(Sum('value'))['value__sum'])
self.assertEqual(2, UserCount.objects.filter(property=interactions_property)
.aggregate(Sum('value'))['value__sum'])
message = self.send_stream_message(user2, stream.name)
do_update_message_flags(user1, client, 'add', 'read', [message])
self.assertEqual(4, UserCount.objects.filter(property=read_count_property)
.aggregate(Sum('value'))['value__sum'])
self.assertEqual(3, UserCount.objects.filter(property=interactions_property)
.aggregate(Sum('value'))['value__sum'])
class TestDeleteStats(AnalyticsTestCase):
def test_do_drop_all_analytics_tables(self) -> None:
user = self.create_user()
@@ -1367,10 +1018,10 @@ class TestActiveUsersAudit(AnalyticsTestCase):
[[user1, 'false'], [user2, 'false']])
def test_end_to_end_with_actions_dot_py(self) -> None:
user1 = do_create_user('email1', 'password', self.default_realm, 'full_name')
user2 = do_create_user('email2', 'password', self.default_realm, 'full_name')
user3 = do_create_user('email3', 'password', self.default_realm, 'full_name')
user4 = do_create_user('email4', 'password', self.default_realm, 'full_name')
user1 = do_create_user('email1', 'password', self.default_realm, 'full_name', 'short_name')
user2 = do_create_user('email2', 'password', self.default_realm, 'full_name', 'short_name')
user3 = do_create_user('email3', 'password', self.default_realm, 'full_name', 'short_name')
user4 = do_create_user('email4', 'password', self.default_realm, 'full_name', 'short_name')
do_deactivate_user(user2)
do_activate_user(user3)
do_reactivate_user(user4)
@@ -1452,7 +1103,6 @@ class TestRealmActiveHumans(AnalyticsTestCase):
self.create_user(realm=third_realm)
RealmCount.objects.all().delete()
InstallationCount.objects.all().delete()
for i in [-1, 0, 1]:
do_fill_count_stat_at_hour(self.stat, self.TIME_ZERO + i*self.DAY)
self.assertTableState(RealmCount, ['value', 'realm', 'end_time'],
@@ -1462,9 +1112,9 @@ class TestRealmActiveHumans(AnalyticsTestCase):
[2, second_realm, self.TIME_ZERO - self.DAY]])
def test_end_to_end(self) -> None:
user1 = do_create_user('email1', 'password', self.default_realm, 'full_name')
user2 = do_create_user('email2', 'password', self.default_realm, 'full_name')
do_create_user('email3', 'password', self.default_realm, 'full_name')
user1 = do_create_user('email1', 'password', self.default_realm, 'full_name', 'short_name')
user2 = do_create_user('email2', 'password', self.default_realm, 'full_name', 'short_name')
do_create_user('email3', 'password', self.default_realm, 'full_name', 'short_name')
time_zero = floor_to_day(timezone_now()) + self.DAY
update_user_activity_interval(user1, time_zero)
update_user_activity_interval(user2, time_zero)


@@ -2,7 +2,6 @@ from analytics.lib.counts import CountStat
from analytics.lib.fixtures import generate_time_series_data
from zerver.lib.test_classes import ZulipTestCase
# A very light test suite; the code being tested is not run in production.
class TestFixtures(ZulipTestCase):
def test_deterministic_settings(self) -> None:


@@ -1,27 +1,28 @@
from datetime import datetime, timedelta, timezone
from datetime import datetime, timedelta
from typing import List, Optional
from unittest import mock
import ujson
import mock
from django.utils.timezone import utc
from django.http import HttpResponse
from django.utils.timezone import now as timezone_now
import ujson
from analytics.lib.counts import COUNT_STATS, CountStat
from analytics.lib.time_utils import time_range
from analytics.models import FillState, RealmCount, UserCount, last_successful_fill
from analytics.views import rewrite_client_arrays, sort_by_totals, sort_client_labels
from corporate.models import get_customer_by_realm
from zerver.lib.actions import do_create_multiuse_invite_link, do_send_realm_reactivation_email
from analytics.models import FillState, \
RealmCount, UserCount, last_successful_fill
from analytics.views import rewrite_client_arrays, \
sort_by_totals, sort_client_labels
from zerver.lib.test_classes import ZulipTestCase
from zerver.lib.test_helpers import reset_emails_in_zulip_realm
from zerver.lib.timestamp import ceiling_to_day, ceiling_to_hour, datetime_to_timestamp
from zerver.models import Client, MultiuseInvite, PreregistrationUser, get_realm
from zerver.lib.timestamp import ceiling_to_day, \
ceiling_to_hour, datetime_to_timestamp
from zerver.lib.actions import do_create_multiuse_invite_link, \
do_send_realm_reactivation_email
from zerver.models import Client, get_realm, MultiuseInvite
class TestStatsEndpoint(ZulipTestCase):
def test_stats(self) -> None:
self.user = self.example_user('hamlet')
self.login_user(self.user)
self.login(self.user.email)
result = self.client_get('/stats')
self.assertEqual(result.status_code, 200)
# Check that we get something back
@@ -29,7 +30,7 @@ class TestStatsEndpoint(ZulipTestCase):
def test_guest_user_cant_access_stats(self) -> None:
self.user = self.example_user('polonius')
self.login_user(self.user)
self.login(self.user.email)
result = self.client_get('/stats')
self.assert_json_error(result, "Not allowed for guest users", 400)
@@ -37,15 +38,15 @@ class TestStatsEndpoint(ZulipTestCase):
self.assert_json_error(result, "Not allowed for guest users", 400)
def test_stats_for_realm(self) -> None:
user = self.example_user('hamlet')
self.login_user(user)
user_profile = self.example_user('hamlet')
self.login(user_profile.email)
result = self.client_get('/stats/realm/zulip/')
self.assertEqual(result.status_code, 302)
user = self.example_user('hamlet')
user.is_staff = True
user.save(update_fields=['is_staff'])
user_profile = self.example_user('hamlet')
user_profile.is_staff = True
user_profile.save(update_fields=['is_staff'])
result = self.client_get('/stats/realm/not_existing_realm/')
self.assertEqual(result.status_code, 302)
@@ -55,15 +56,15 @@ class TestStatsEndpoint(ZulipTestCase):
self.assert_in_response("Zulip analytics for", result)
def test_stats_for_installation(self) -> None:
user = self.example_user('hamlet')
self.login_user(user)
user_profile = self.example_user('hamlet')
self.login(user_profile.email)
result = self.client_get('/stats/installation')
self.assertEqual(result.status_code, 302)
user = self.example_user('hamlet')
user.is_staff = True
user.save(update_fields=['is_staff'])
user_profile = self.example_user('hamlet')
user_profile.is_staff = True
user_profile.save(update_fields=['is_staff'])
result = self.client_get('/stats/installation')
self.assertEqual(result.status_code, 200)
@@ -74,7 +75,7 @@ class TestGetChartData(ZulipTestCase):
super().setUp()
self.realm = get_realm('zulip')
self.user = self.example_user('hamlet')
self.login_user(self.user)
self.login(self.user.email)
self.end_times_hour = [ceiling_to_hour(self.realm.date_created) + timedelta(hours=i)
for i in range(4)]
self.end_times_day = [ceiling_to_day(self.realm.date_created) + timedelta(days=i)
@@ -182,23 +183,6 @@ class TestGetChartData(ZulipTestCase):
'result': 'success',
})
def test_messages_read_over_time(self) -> None:
stat = COUNT_STATS['messages_read::hour']
self.insert_data(stat, [None], [])
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'messages_read_over_time'})
self.assert_json_success(result)
data = result.json()
self.assertEqual(data, {
'msg': '',
'end_times': [datetime_to_timestamp(dt) for dt in self.end_times_hour],
'frequency': CountStat.HOUR,
'everyone': {'read': self.data(100)},
'user': {'read': self.data(0)},
'display_order': None,
'result': 'success',
})
def test_include_empty_subgroups(self) -> None:
FillState.objects.create(
property='realm_active_humans::day', end_time=self.end_times_day[0],
@@ -301,86 +285,24 @@ class TestGetChartData(ZulipTestCase):
self.assert_json_error_contains(result, 'Unknown chart name')
def test_analytics_not_running(self) -> None:
realm = get_realm("zulip")
self.assertEqual(FillState.objects.count(), 0)
realm.date_created = timezone_now() - timedelta(days=3)
realm.save(update_fields=["date_created"])
# try to get data for a valid chart, but before we've put anything in the database
# (e.g. before update_analytics_counts has been run)
with mock.patch('logging.warning'):
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'messages_sent_over_time'})
{'chart_name': 'number_of_humans'})
self.assert_json_error_contains(result, 'No analytics data available')
realm.date_created = timezone_now() - timedelta(days=1, hours=2)
realm.save(update_fields=["date_created"])
with mock.patch('logging.warning'):
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'messages_sent_over_time'})
self.assert_json_error_contains(result, 'No analytics data available')
realm.date_created = timezone_now() - timedelta(days=1, minutes=10)
realm.save(update_fields=["date_created"])
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'messages_sent_over_time'})
self.assert_json_success(result)
realm.date_created = timezone_now() - timedelta(hours=10)
realm.save(update_fields=["date_created"])
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'messages_sent_over_time'})
self.assert_json_success(result)
end_time = timezone_now() - timedelta(days=5)
fill_state = FillState.objects.create(property='messages_sent:is_bot:hour', end_time=end_time,
state=FillState.DONE)
realm.date_created = timezone_now() - timedelta(days=3)
realm.save(update_fields=["date_created"])
with mock.patch('logging.warning'):
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'messages_sent_over_time'})
self.assert_json_error_contains(result, 'No analytics data available')
realm.date_created = timezone_now() - timedelta(days=1, minutes=10)
realm.save(update_fields=["date_created"])
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'messages_sent_over_time'})
self.assert_json_success(result)
end_time = timezone_now() - timedelta(days=2)
fill_state.end_time = end_time
fill_state.save(update_fields=["end_time"])
realm.date_created = timezone_now() - timedelta(days=3)
realm.save(update_fields=["date_created"])
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'messages_sent_over_time'})
self.assert_json_success(result)
realm.date_created = timezone_now() - timedelta(days=1, hours=2)
realm.save(update_fields=["date_created"])
with mock.patch('logging.warning'):
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'messages_sent_over_time'})
self.assert_json_error_contains(result, 'No analytics data available')
realm.date_created = timezone_now() - timedelta(days=1, minutes=10)
realm.save(update_fields=["date_created"])
result = self.client_get('/json/analytics/chart_data', {'chart_name': 'messages_sent_over_time'})
self.assert_json_success(result)
def test_get_chart_data_for_realm(self) -> None:
user = self.example_user('hamlet')
self.login_user(user)
user_profile = self.example_user('hamlet')
self.login(user_profile.email)
result = self.client_get('/json/analytics/chart_data/realm/zulip',
result = self.client_get('/json/analytics/chart_data/realm/zulip/',
{'chart_name': 'number_of_humans'})
self.assert_json_error(result, "Must be an server administrator", 400)
user = self.example_user('hamlet')
user.is_staff = True
user.save(update_fields=['is_staff'])
user_profile = self.example_user('hamlet')
user_profile.is_staff = True
user_profile.save(update_fields=['is_staff'])
stat = COUNT_STATS['realm_active_humans::day']
self.insert_data(stat, [None], [])
@@ -393,16 +315,16 @@ class TestGetChartData(ZulipTestCase):
self.assert_json_success(result)
def test_get_chart_data_for_installation(self) -> None:
user = self.example_user('hamlet')
self.login_user(user)
user_profile = self.example_user('hamlet')
self.login(user_profile.email)
result = self.client_get('/json/analytics/chart_data/installation',
{'chart_name': 'number_of_humans'})
self.assert_json_error(result, "Must be an server administrator", 400)
user = self.example_user('hamlet')
user.is_staff = True
user.save(update_fields=['is_staff'])
user_profile = self.example_user('hamlet')
user_profile.is_staff = True
user_profile.save(update_fields=['is_staff'])
stat = COUNT_STATS['realm_active_humans::day']
self.insert_data(stat, [None], [])
@@ -412,18 +334,16 @@ class TestGetChartData(ZulipTestCase):
class TestSupportEndpoint(ZulipTestCase):
def test_search(self) -> None:
reset_emails_in_zulip_realm()
def check_hamlet_user_query_result(result: HttpResponse) -> None:
self.assert_in_success_response(['<span class="label">user</span>\n', '<h3>King Hamlet</h3>',
'<b>Email</b>: hamlet@zulip.com', '<b>Is active</b>: True<br>',
'<b>Admins</b>: desdemona@zulip.com, iago@zulip.com\n',
'class="copy-button" data-copytext="desdemona@zulip.com, iago@zulip.com"',
'<b>Admins</b>: iago@zulip.com\n',
'class="copy-button" data-copytext="iago@zulip.com"'
], result)
def check_zulip_realm_query_result(result: HttpResponse) -> None:
zulip_realm = get_realm("zulip")
self.assert_in_success_response([f'<input type="hidden" name="realm_id" value="{zulip_realm.id}"',
self.assert_in_success_response(['<input type="hidden" name="realm_id" value="%s"' % (zulip_realm.id,),
'Zulip Dev</h3>',
'<option value="1" selected>Self Hosted</option>',
'<option value="2" >Limited</option>',
@@ -435,7 +355,7 @@ class TestSupportEndpoint(ZulipTestCase):
def check_lear_realm_query_result(result: HttpResponse) -> None:
lear_realm = get_realm("lear")
self.assert_in_success_response([f'<input type="hidden" name="realm_id" value="{lear_realm.id}"',
self.assert_in_success_response(['<input type="hidden" name="realm_id" value="%s"' % (lear_realm.id,),
'Lear &amp; Co.</h3>',
'<option value="1" selected>Self Hosted</option>',
'<option value="2" >Limited</option>',
@@ -445,9 +365,9 @@ class TestSupportEndpoint(ZulipTestCase):
'scrub-realm-button">',
'data-string-id="lear"'], result)
def check_preregistration_user_query_result(result: HttpResponse, email: str, invite: bool=False) -> None:
def check_preregistration_user_query_result(result: HttpResponse, email: str, invite: Optional[bool]=False) -> None:
self.assert_in_success_response(['<span class="label">preregistration user</span>\n',
f'<b>Email</b>: {email}',
'<b>Email</b>: {}'.format(email),
], result)
if invite:
self.assert_in_success_response(['<span class="label">invite</span>'], result)
@@ -462,29 +382,31 @@ class TestSupportEndpoint(ZulipTestCase):
def check_realm_creation_query_result(result: HttpResponse, email: str) -> None:
self.assert_in_success_response(['<span class="label">preregistration user</span>\n',
'<span class="label">realm creation</span>\n',
'<b>Link</b>: http://testserver/accounts/do_confirm/',
'<b>Expires in</b>: 1\xa0day<br>\n',
'<b>Link</b>: http://zulip.testserver/accounts/do_confirm/',
'<b>Expires in</b>: 1\xa0day<br>\n'
], result)
def check_multiuse_invite_link_query_result(result: HttpResponse) -> None:
self.assert_in_success_response(['<span class="label">multiuse invite</span>\n',
'<b>Link</b>: http://zulip.testserver/join/',
'<b>Expires in</b>: 1\xa0week, 3',
'<b>Expires in</b>: 1\xa0week, 3'
], result)
def check_realm_reactivation_link_query_result(result: HttpResponse) -> None:
self.assert_in_success_response(['<span class="label">realm reactivation</span>\n',
'<b>Link</b>: http://zulip.testserver/reactivate/',
'<b>Expires in</b>: 1\xa0day',
'<b>Expires in</b>: 1\xa0day'
], result)
self.login('cordelia')
cordelia_email = self.example_email("cordelia")
self.login(cordelia_email)
result = self.client_get("/activity/support")
self.assertEqual(result.status_code, 302)
self.assertEqual(result["Location"], "/login/")
self.login('iago')
iago_email = self.example_email("iago")
self.login(iago_email)
result = self.client_get("/activity/support")
self.assert_in_success_response(['<input type="text" name="q" class="input-xxlarge search-query"'], result)
@@ -514,7 +436,7 @@ class TestSupportEndpoint(ZulipTestCase):
check_lear_realm_query_result(result)
self.client_post('/accounts/home/', {'email': self.nonreg_email("test")})
self.login('iago')
self.login(iago_email)
result = self.client_get("/activity/support", {"q": self.nonreg_email("test")})
check_preregistration_user_query_result(result, self.nonreg_email("test"))
check_zulip_realm_query_result(result)
@@ -522,8 +444,7 @@ class TestSupportEndpoint(ZulipTestCase):
stream_ids = [self.get_stream_id("Denmark")]
invitee_emails = [self.nonreg_email("test1")]
self.client_post("/json/invites", {"invitee_emails": invitee_emails,
"stream_ids": ujson.dumps(stream_ids),
"invite_as": PreregistrationUser.INVITE_AS['MEMBER']})
"stream_ids": ujson.dumps(stream_ids), "invite_as": 1})
result = self.client_get("/activity/support", {"q": self.nonreg_email("test1")})
check_preregistration_user_query_result(result, self.nonreg_email("test1"), invite=True)
check_zulip_realm_query_result(result)
@@ -545,106 +466,79 @@ class TestSupportEndpoint(ZulipTestCase):
check_zulip_realm_query_result(result)
def test_change_plan_type(self) -> None:
cordelia = self.example_user('cordelia')
self.login_user(cordelia)
cordelia = self.example_user("cordelia")
self.login(cordelia.email)
result = self.client_post("/activity/support", {"realm_id": f"{cordelia.realm_id}", "plan_type": "2"})
result = self.client_post("/activity/support", {"realm_id": "%s" % (cordelia.realm_id,), "plan_type": "2"})
self.assertEqual(result.status_code, 302)
self.assertEqual(result["Location"], "/login/")
iago = self.example_user("iago")
self.login_user(iago)
self.login(iago.email)
with mock.patch("analytics.views.do_change_plan_type") as m:
result = self.client_post("/activity/support", {"realm_id": f"{iago.realm_id}", "plan_type": "2"})
result = self.client_post("/activity/support", {"realm_id": "%s" % (iago.realm_id,), "plan_type": "2"})
m.assert_called_once_with(get_realm("zulip"), 2)
self.assert_in_success_response(["Plan type of Zulip Dev changed from self hosted to limited"], result)
def test_attach_discount(self) -> None:
cordelia = self.example_user('cordelia')
lear_realm = get_realm('lear')
self.login_user(cordelia)
lear_realm = get_realm("lear")
cordelia_email = self.example_email("cordelia")
self.login(cordelia_email)
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}", "discount": "25"})
result = self.client_post("/activity/support", {"realm_id": "%s" % (lear_realm.id,), "discount": "25"})
self.assertEqual(result.status_code, 302)
self.assertEqual(result["Location"], "/login/")
self.login('iago')
iago_email = self.example_email("iago")
self.login(iago_email)
with mock.patch("analytics.views.attach_discount_to_realm") as m:
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}", "discount": "25"})
result = self.client_post("/activity/support", {"realm_id": "%s" % (lear_realm.id,), "discount": "25"})
m.assert_called_once_with(get_realm("lear"), 25)
self.assert_in_success_response(["Discount of Lear &amp; Co. changed to 25 from None"], result)
def test_change_sponsorship_status(self) -> None:
lear_realm = get_realm("lear")
self.assertIsNone(get_customer_by_realm(lear_realm))
cordelia = self.example_user('cordelia')
self.login_user(cordelia)
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}",
"sponsorship_pending": "true"})
self.assertEqual(result.status_code, 302)
self.assertEqual(result["Location"], "/login/")
iago = self.example_user("iago")
self.login_user(iago)
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}",
"sponsorship_pending": "true"})
self.assert_in_success_response(["Lear &amp; Co. marked as pending sponsorship."], result)
customer = get_customer_by_realm(lear_realm)
assert(customer is not None)
self.assertTrue(customer.sponsorship_pending)
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}",
"sponsorship_pending": "false"})
self.assert_in_success_response(["Lear &amp; Co. is no longer pending sponsorship."], result)
customer = get_customer_by_realm(lear_realm)
assert(customer is not None)
self.assertFalse(customer.sponsorship_pending)
def test_activate_or_deactivate_realm(self) -> None:
cordelia = self.example_user('cordelia')
lear_realm = get_realm('lear')
self.login_user(cordelia)
lear_realm = get_realm("lear")
cordelia_email = self.example_email("cordelia")
self.login(cordelia_email)
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}", "status": "deactivated"})
result = self.client_post("/activity/support", {"realm_id": "%s" % (lear_realm.id,), "status": "deactivated"})
self.assertEqual(result.status_code, 302)
self.assertEqual(result["Location"], "/login/")
self.login('iago')
iago_email = self.example_email("iago")
self.login(iago_email)
with mock.patch("analytics.views.do_deactivate_realm") as m:
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}", "status": "deactivated"})
result = self.client_post("/activity/support", {"realm_id": "%s" % (lear_realm.id,), "status": "deactivated"})
m.assert_called_once_with(lear_realm, self.example_user("iago"))
self.assert_in_success_response(["Lear &amp; Co. deactivated"], result)
with mock.patch("analytics.views.do_send_realm_reactivation_email") as m:
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}", "status": "active"})
result = self.client_post("/activity/support", {"realm_id": "%s" % (lear_realm.id,), "status": "active"})
m.assert_called_once_with(lear_realm)
self.assert_in_success_response(["Realm reactivation email sent to admins of Lear"], result)
def test_scrub_realm(self) -> None:
cordelia = self.example_user('cordelia')
lear_realm = get_realm('lear')
self.login_user(cordelia)
lear_realm = get_realm("lear")
cordelia_email = self.example_email("cordelia")
self.login(cordelia_email)
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}", "discount": "25"})
result = self.client_post("/activity/support", {"realm_id": "%s" % (lear_realm.id,), "discount": "25"})
self.assertEqual(result.status_code, 302)
self.assertEqual(result["Location"], "/login/")
self.login('iago')
iago_email = self.example_email("iago")
self.login(iago_email)
with mock.patch("analytics.views.do_scrub_realm") as m:
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}", "scrub_realm": "scrub_realm"})
m.assert_called_once_with(lear_realm, acting_user=self.example_user("iago"))
result = self.client_post("/activity/support", {"realm_id": "%s" % (lear_realm.id,), "scrub_realm": "scrub_realm"})
m.assert_called_once_with(lear_realm)
self.assert_in_success_response(["Lear &amp; Co. scrubbed"], result)
with mock.patch("analytics.views.do_scrub_realm") as m:
with self.assertRaises(AssertionError):
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}"})
result = self.client_post("/activity/support", {"realm_id": "%s" % (lear_realm.id,)})
m.assert_not_called()
class TestGetChartDataHelpers(ZulipTestCase):
@@ -652,8 +546,8 @@ class TestGetChartDataHelpers(ZulipTestCase):
# the only function that uses it at the moment
def test_last_successful_fill(self) -> None:
self.assertIsNone(last_successful_fill('non-existant'))
a_time = datetime(2016, 3, 14, 19, tzinfo=timezone.utc)
one_hour_before = datetime(2016, 3, 14, 18, tzinfo=timezone.utc)
a_time = datetime(2016, 3, 14, 19).replace(tzinfo=utc)
one_hour_before = datetime(2016, 3, 14, 18).replace(tzinfo=utc)
fillstate = FillState.objects.create(property='property', end_time=a_time,
state=FillState.DONE)
self.assertEqual(last_successful_fill('property'), a_time)
@@ -662,7 +556,7 @@ class TestGetChartDataHelpers(ZulipTestCase):
self.assertEqual(last_successful_fill('property'), one_hour_before)
def test_sort_by_totals(self) -> None:
empty: List[int] = []
empty = [] # type: List[int]
value_arrays = {'c': [0, 1], 'a': [9], 'b': [1, 1, 1], 'd': empty}
self.assertEqual(sort_by_totals(value_arrays), ['a', 'b', 'c', 'd'])
@@ -676,9 +570,9 @@ class TestTimeRange(ZulipTestCase):
HOUR = timedelta(hours=1)
DAY = timedelta(days=1)
a_time = datetime(2016, 3, 14, 22, 59, tzinfo=timezone.utc)
floor_hour = datetime(2016, 3, 14, 22, tzinfo=timezone.utc)
floor_day = datetime(2016, 3, 14, tzinfo=timezone.utc)
a_time = datetime(2016, 3, 14, 22, 59).replace(tzinfo=utc)
floor_hour = datetime(2016, 3, 14, 22).replace(tzinfo=utc)
floor_day = datetime(2016, 3, 14).replace(tzinfo=utc)
# test start == end
self.assertEqual(time_range(a_time, a_time, CountStat.HOUR, None), [])


@@ -1,34 +1,33 @@
from django.conf.urls import include
from django.urls import path
from django.conf.urls import include, url
import analytics.views
from zerver.lib.rest import rest_dispatch
i18n_urlpatterns = [
# Server admin (user_profile.is_staff) visible stats pages
path('activity', analytics.views.get_activity,
name='analytics.views.get_activity'),
path('activity/support', analytics.views.support,
name='analytics.views.support'),
path('realm_activity/<str:realm_str>/', analytics.views.get_realm_activity,
name='analytics.views.get_realm_activity'),
path('user_activity/<str:email>/', analytics.views.get_user_activity,
name='analytics.views.get_user_activity'),
url(r'^activity$', analytics.views.get_activity,
name='analytics.views.get_activity'),
url(r'^activity/support$', analytics.views.support,
name='analytics.views.support'),
url(r'^realm_activity/(?P<realm_str>[\S]+)/$', analytics.views.get_realm_activity,
name='analytics.views.get_realm_activity'),
url(r'^user_activity/(?P<email>[\S]+)/$', analytics.views.get_user_activity,
name='analytics.views.get_user_activity'),
path('stats/realm/<str:realm_str>/', analytics.views.stats_for_realm,
name='analytics.views.stats_for_realm'),
path('stats/installation', analytics.views.stats_for_installation,
name='analytics.views.stats_for_installation'),
path('stats/remote/<int:remote_server_id>/installation',
analytics.views.stats_for_remote_installation,
name='analytics.views.stats_for_remote_installation'),
path('stats/remote/<int:remote_server_id>/realm/<int:remote_realm_id>/',
analytics.views.stats_for_remote_realm,
name='analytics.views.stats_for_remote_realm'),
url(r'^stats/realm/(?P<realm_str>[\S]+)/$', analytics.views.stats_for_realm,
name='analytics.views.stats_for_realm'),
url(r'^stats/installation$', analytics.views.stats_for_installation,
name='analytics.views.stats_for_installation'),
url(r'^stats/remote/(?P<remote_server_id>[\S]+)/installation$',
analytics.views.stats_for_remote_installation,
name='analytics.views.stats_for_remote_installation'),
url(r'^stats/remote/(?P<remote_server_id>[\S]+)/realm/(?P<remote_realm_id>[\S]+)/$',
analytics.views.stats_for_remote_realm,
name='analytics.views.stats_for_remote_realm'),
# User-visible stats page
path('stats', analytics.views.stats,
name='analytics.views.stats'),
url(r'^stats$', analytics.views.stats,
name='analytics.views.stats'),
]
# These endpoints are a part of the API (V1), which uses:
@@ -41,22 +40,22 @@ i18n_urlpatterns = [
# All of these paths are accessed by either a /json or /api prefix
v1_api_and_json_patterns = [
# get data for the graphs at /stats
path('analytics/chart_data', rest_dispatch,
{'GET': 'analytics.views.get_chart_data'}),
path('analytics/chart_data/realm/<str:realm_str>', rest_dispatch,
{'GET': 'analytics.views.get_chart_data_for_realm'}),
path('analytics/chart_data/installation', rest_dispatch,
{'GET': 'analytics.views.get_chart_data_for_installation'}),
path('analytics/chart_data/remote/<int:remote_server_id>/installation', rest_dispatch,
{'GET': 'analytics.views.get_chart_data_for_remote_installation'}),
path('analytics/chart_data/remote/<int:remote_server_id>/realm/<int:remote_realm_id>',
rest_dispatch,
{'GET': 'analytics.views.get_chart_data_for_remote_realm'}),
url(r'^analytics/chart_data$', rest_dispatch,
{'GET': 'analytics.views.get_chart_data'}),
url(r'^analytics/chart_data/realm/(?P<realm_str>[\S]+)$', rest_dispatch,
{'GET': 'analytics.views.get_chart_data_for_realm'}),
url(r'^analytics/chart_data/installation$', rest_dispatch,
{'GET': 'analytics.views.get_chart_data_for_installation'}),
url(r'^analytics/chart_data/remote/(?P<remote_server_id>[\S]+)/installation$', rest_dispatch,
{'GET': 'analytics.views.get_chart_data_for_remote_installation'}),
url(r'^analytics/chart_data/remote/(?P<remote_server_id>[\S]+)/realm/(?P<remote_realm_id>[\S]+)$',
rest_dispatch,
{'GET': 'analytics.views.get_chart_data_for_remote_realm'}),
]
i18n_urlpatterns += [
path('api/v1/', include(v1_api_and_json_patterns)),
path('json/', include(v1_api_and_json_patterns)),
url(r'^api/v1/', include(v1_api_and_json_patterns)),
url(r'^json/', include(v1_api_and_json_patterns)),
]
urlpatterns = i18n_urlpatterns
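
Every hunk in this file performs the same translation, from Django 2.0's path() converters back to the regex-based url() patterns that older deployments understand. The two spellings are close but not identical: <str:realm_str> matches one or more characters excluding '/', while [\S]+ accepts any non-whitespace run, '/' included, so the regex form is slightly more permissive. An illustrative pair (assuming a Django version where both import styles are available):

from django.conf.urls import url
from django.urls import path

import analytics.views

# Django 2.0+ converter syntax, as used on the 3.1 side:
modern = path('stats/realm/<str:realm_str>/', analytics.views.stats_for_realm)
# Regex equivalent kept on 2.1.x; note [\S]+ would also match an embedded '/'.
legacy = url(r'^stats/realm/(?P<realm_str>[\S]+)/$', analytics.views.stats_for_realm)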


@@ -4,95 +4,62 @@ import re
import time
import urllib
from collections import defaultdict
from datetime import datetime, timedelta, timezone
from datetime import datetime, timedelta
from decimal import Decimal
from typing import Any, Callable, Dict, List, Optional, Sequence, Set, Tuple, Type, Union
from typing import Any, Callable, Dict, List, \
Optional, Set, Tuple, Type, Union
import pytz
from django.conf import settings
from django.core.exceptions import ValidationError
from django.core.validators import URLValidator
from django.urls import reverse
from django.db import connection
from django.db.models.query import QuerySet
from django.http import HttpRequest, HttpResponse, HttpResponseNotFound
from django.shortcuts import render
from django.template import loader
from django.urls import reverse
from django.utils.timesince import timesince
from django.utils.timezone import now as timezone_now
from django.utils.timezone import now as timezone_now, utc as timezone_utc
from django.utils.translation import ugettext as _
from django.utils.timesince import timesince
from django.core.validators import URLValidator
from django.core.exceptions import ValidationError
from jinja2 import Markup as mark_safe
from psycopg2.sql import SQL, Composable, Literal
from analytics.lib.counts import COUNT_STATS, CountStat
from analytics.lib.time_utils import time_range
from analytics.models import (
BaseCount,
InstallationCount,
RealmCount,
StreamCount,
UserCount,
installation_epoch,
last_successful_fill,
)
from confirmation.models import Confirmation, _properties, confirmation_url
from confirmation.settings import STATUS_ACTIVE
from zerver.decorator import (
require_non_guest_user,
require_server_admin,
require_server_admin_api,
to_utc_datetime,
zulip_login_required,
)
from zerver.lib.actions import (
do_change_plan_type,
do_deactivate_realm,
do_scrub_realm,
do_send_realm_reactivation_email,
)
from analytics.models import BaseCount, InstallationCount, \
RealmCount, StreamCount, UserCount, last_successful_fill, installation_epoch
from confirmation.models import Confirmation, confirmation_url, _properties
from zerver.decorator import require_server_admin, require_server_admin_api, \
to_non_negative_int, to_utc_datetime, zulip_login_required, require_non_guest_user
from zerver.lib.exceptions import JsonableError
from zerver.lib.realm_icon import realm_icon_url
from zerver.lib.request import REQ, has_request_variables
from zerver.lib.response import json_success
from zerver.lib.subdomains import get_subdomain_from_hostname
from zerver.lib.timestamp import convert_to_UTC, timestamp_to_datetime
from zerver.lib.validator import to_non_negative_int
from zerver.models import (
Client,
MultiuseInvite,
PreregistrationUser,
Realm,
UserActivity,
UserActivityInterval,
UserProfile,
get_realm,
)
from zerver.lib.realm_icon import realm_icon_url
from zerver.views.invite import get_invitee_emails_set
from zerver.lib.subdomains import get_subdomain_from_hostname
from zerver.lib.actions import do_change_plan_type, do_deactivate_realm, \
do_send_realm_reactivation_email, do_scrub_realm
from confirmation.settings import STATUS_ACTIVE
if settings.BILLING_ENABLED:
from corporate.lib.stripe import (
attach_discount_to_realm,
get_customer_by_realm,
get_discount_for_realm,
update_sponsorship_status,
)
from corporate.lib.stripe import attach_discount_to_realm, get_discount_for_realm
from zerver.models import Client, get_realm, Realm, UserActivity, UserActivityInterval, \
UserProfile, PreregistrationUser, MultiuseInvite
if settings.ZILENCER_ENABLED:
from zilencer.models import RemoteInstallationCount, RemoteRealmCount, RemoteZulipServer
from zilencer.models import RemoteInstallationCount, RemoteRealmCount, \
RemoteZulipServer
else:
from unittest.mock import Mock
RemoteInstallationCount = Mock() # type: ignore[misc] # https://github.com/JukkaL/mypy/issues/1188
RemoteZulipServer = Mock() # type: ignore[misc] # https://github.com/JukkaL/mypy/issues/1188
RemoteRealmCount = Mock() # type: ignore[misc] # https://github.com/JukkaL/mypy/issues/1188
MAX_TIME_FOR_FULL_ANALYTICS_GENERATION = timedelta(days=1, minutes=30)
def is_analytics_ready(realm: Realm) -> bool:
return (timezone_now() - realm.date_created) > MAX_TIME_FOR_FULL_ANALYTICS_GENERATION
from mock import Mock
RemoteInstallationCount = Mock() # type: ignore # https://github.com/JukkaL/mypy/issues/1188
RemoteZulipServer = Mock() # type: ignore # https://github.com/JukkaL/mypy/issues/1188
RemoteRealmCount = Mock() # type: ignore # https://github.com/JukkaL/mypy/issues/1188
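
Both sides of the hunk above stub the zilencer models with Mock objects when ZILENCER_ENABLED is off; they differ only in importing Mock from the stdlib's unittest.mock rather than the external mock backport, and in scoping the ignore to mypy's [misc] error code. A self-contained sketch of the pattern, with a hypothetical FEATURE_ENABLED flag standing in for the Django setting:

FEATURE_ENABLED = False  # stands in for settings.ZILENCER_ENABLED

if FEATURE_ENABLED:
    from zilencer.models import RemoteInstallationCount
else:
    from unittest.mock import Mock
    # Rebinding a model name to a Mock instance keeps module-level code
    # importable; mypy flags assignment to a type, hence the ignore.
    RemoteInstallationCount = Mock()  # type: ignore[misc]

print(RemoteInstallationCount.objects.filter(server_id=1))  # a Mock, no DB hit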
def render_stats(request: HttpRequest, data_url_suffix: str, target_name: str,
for_installation: bool=False, remote: bool=False,
analytics_ready: bool=True) -> HttpRequest:
for_installation: bool=False, remote: bool=False) -> HttpRequest:
page_params = dict(
data_url_suffix=data_url_suffix,
for_installation=for_installation,
@@ -102,8 +69,7 @@ def render_stats(request: HttpRequest, data_url_suffix: str, target_name: str,
return render(request,
'analytics/stats.html',
context=dict(target_name=target_name,
page_params=page_params,
analytics_ready=analytics_ready))
page_params=page_params))
@zulip_login_required
def stats(request: HttpRequest) -> HttpResponse:
@@ -112,8 +78,7 @@ def stats(request: HttpRequest) -> HttpResponse:
# TODO: Make @zulip_login_required pass the UserProfile so we
# can use @require_member_or_admin
raise JsonableError(_("Not allowed for guest users"))
return render_stats(request, '', realm.name or realm.string_id,
analytics_ready=is_analytics_ready(realm))
return render_stats(request, '', realm.name or realm.string_id)
@require_server_admin
@has_request_variables
@@ -121,18 +86,17 @@ def stats_for_realm(request: HttpRequest, realm_str: str) -> HttpResponse:
try:
realm = get_realm(realm_str)
except Realm.DoesNotExist:
return HttpResponseNotFound(f"Realm {realm_str} does not exist")
return HttpResponseNotFound("Realm %s does not exist" % (realm_str,))
return render_stats(request, f'/realm/{realm_str}', realm.name or realm.string_id,
analytics_ready=is_analytics_ready(realm))
return render_stats(request, '/realm/%s' % (realm_str,), realm.name or realm.string_id)
@require_server_admin
@has_request_variables
def stats_for_remote_realm(request: HttpRequest, remote_server_id: int,
remote_realm_id: int) -> HttpResponse:
def stats_for_remote_realm(request: HttpRequest, remote_server_id: str,
remote_realm_id: str) -> HttpResponse:
server = RemoteZulipServer.objects.get(id=remote_server_id)
return render_stats(request, f'/remote/{server.id}/realm/{remote_realm_id}',
f"Realm {remote_realm_id} on server {server.hostname}")
return render_stats(request, '/remote/%s/realm/%s' % (server.id, remote_realm_id),
"Realm %s on server %s" % (remote_realm_id, server.hostname))
@require_server_admin_api
@has_request_variables
@@ -148,8 +112,8 @@ def get_chart_data_for_realm(request: HttpRequest, user_profile: UserProfile,
@require_server_admin_api
@has_request_variables
def get_chart_data_for_remote_realm(
request: HttpRequest, user_profile: UserProfile, remote_server_id: int,
remote_realm_id: int, **kwargs: Any) -> HttpResponse:
request: HttpRequest, user_profile: UserProfile, remote_server_id: str,
remote_realm_id: str, **kwargs: Any) -> HttpResponse:
server = RemoteZulipServer.objects.get(id=remote_server_id)
return get_chart_data(request=request, user_profile=user_profile, server=server,
remote=True, remote_realm_id=int(remote_realm_id), **kwargs)
@@ -159,10 +123,10 @@ def stats_for_installation(request: HttpRequest) -> HttpResponse:
return render_stats(request, '/installation', 'Installation', True)
@require_server_admin
def stats_for_remote_installation(request: HttpRequest, remote_server_id: int) -> HttpResponse:
def stats_for_remote_installation(request: HttpRequest, remote_server_id: str) -> HttpResponse:
server = RemoteZulipServer.objects.get(id=remote_server_id)
return render_stats(request, f'/remote/{server.id}/installation',
f'remote Installation {server.hostname}', True, True)
return render_stats(request, '/remote/%s/installation' % (server.id,),
'remote Installation %s' % (server.hostname,), True, True)
@require_server_admin_api
@has_request_variables
@@ -175,7 +139,7 @@ def get_chart_data_for_installation(request: HttpRequest, user_profile: UserProf
def get_chart_data_for_remote_installation(
request: HttpRequest,
user_profile: UserProfile,
remote_server_id: int,
remote_server_id: str,
chart_name: str=REQ(),
**kwargs: Any) -> HttpResponse:
server = RemoteZulipServer.objects.get(id=remote_server_id)
@@ -211,10 +175,10 @@ def get_chart_data(request: HttpRequest, user_profile: UserProfile, chart_name:
COUNT_STATS['realm_active_humans::day'],
COUNT_STATS['active_users_audit:is_bot:day']]
tables = [aggregate_table]
subgroup_to_label: Dict[CountStat, Dict[Optional[str], str]] = {
subgroup_to_label = {
stats[0]: {None: '_1day'},
stats[1]: {None: '_15day'},
stats[2]: {'false': 'all_time'}}
stats[2]: {'false': 'all_time'}} # type: Dict[CountStat, Dict[Optional[str], str]]
labels_sort_function = None
include_empty_subgroups = True
elif chart_name == 'messages_sent_over_time':
@@ -240,14 +204,8 @@ def get_chart_data(request: HttpRequest, user_profile: UserProfile, chart_name:
{str(id): name for id, name in Client.objects.values_list('id', 'name')}}
labels_sort_function = sort_client_labels
include_empty_subgroups = False
elif chart_name == 'messages_read_over_time':
stats = [COUNT_STATS['messages_read::hour']]
tables = [aggregate_table, UserCount]
subgroup_to_label = {stats[0]: {None: 'read'}}
labels_sort_function = None
include_empty_subgroups = True
else:
raise JsonableError(_("Unknown chart name: {}").format(chart_name))
raise JsonableError(_("Unknown chart name: %s") % (chart_name,))
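
Much of the churn in this file is mechanical: the 3.x side uses PEP 526 variable annotations (a Python 3.6+ feature), while the 2.1.x side keeps Python-3.5-compatible `# type:` comments. Both spellings mean the same thing to mypy; a minimal sketch using names from the hunks below:

from typing import Dict, List

# PEP 526 inline annotation (Python 3.6+):
label_sort_values: Dict[str, float] = {}

# Equivalent mypy type comment (also valid on Python 3.5):
mapped_arrays = {}  # type: Dict[str, List[int]]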
# Most likely someone using our API endpoint. The /stats page does not
# pass a start or end in its requests.
@@ -256,9 +214,8 @@ def get_chart_data(request: HttpRequest, user_profile: UserProfile, chart_name:
if end is not None:
end = convert_to_UTC(end)
if start is not None and end is not None and start > end:
raise JsonableError(_("Start time is later than end time. Start: {start}, End: {end}").format(
start=start, end=end,
))
raise JsonableError(_("Start time is later than end time. Start: %(start)s, End: %(end)s") %
{'start': start, 'end': end})
if realm is None:
# Note that this value is invalid for Remote tables; be
@@ -286,18 +243,17 @@ def get_chart_data(request: HttpRequest, user_profile: UserProfile, chart_name:
start = realm.date_created
if end is None:
end = max(last_successful_fill(stat.property) or
datetime.min.replace(tzinfo=timezone.utc) for stat in stats)
if start > end and (timezone_now() - start > MAX_TIME_FOR_FULL_ANALYTICS_GENERATION):
datetime.min.replace(tzinfo=timezone_utc) for stat in stats)
if start > end:
logging.warning("User from realm %s attempted to access /stats, but the computed "
"start time: %s (creation of realm or installation) is later than the computed "
"end time: %s (last successful analytics update). Is the "
"analytics cron job running?", realm.string_id, start, end)
"analytics cron job running?" % (realm.string_id, start, end))
raise JsonableError(_("No analytics data available. Please contact your server administrator."))
assert len({stat.frequency for stat in stats}) == 1
assert len(set([stat.frequency for stat in stats])) == 1
end_times = time_range(start, end, stats[0].frequency, min_length)
data: Dict[str, Any] = {'end_times': end_times, 'frequency': stats[0].frequency}
data = {'end_times': end_times, 'frequency': stats[0].frequency} # type: Dict[str, Any]
aggregation_level = {
InstallationCount: 'everyone',
@@ -342,7 +298,7 @@ def sort_by_totals(value_arrays: Dict[str, List[int]]) -> List[str]:
def sort_client_labels(data: Dict[str, Dict[str, List[int]]]) -> List[str]:
realm_order = sort_by_totals(data['everyone'])
user_order = sort_by_totals(data['user'])
label_sort_values: Dict[str, float] = {}
label_sort_values = {} # type: Dict[str, float]
for i, label in enumerate(realm_order):
label_sort_values[label] = i
for i, label in enumerate(user_order):
@@ -364,7 +320,7 @@ def table_filtered_to_id(table: Type[BaseCount], key_id: int) -> QuerySet:
elif table == RemoteRealmCount:
return RemoteRealmCount.objects.filter(realm_id=key_id)
else:
raise AssertionError(f"Unknown table: {table}")
raise AssertionError("Unknown table: %s" % (table,))
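
String formatting follows the same split: f-strings on the 3.x side (again Python 3.6+), printf-style `%` interpolation on 2.1.x, with `str.format` appearing in a few spots as a middle ground. A quick sketch of the three equivalent spellings seen in these hunks:

table = "RemoteRealmCount"

msg_f = f"Unknown table: {table}"            # f-string (3.6+)
msg_pct = "Unknown table: %s" % (table,)     # trailing comma makes a 1-tuple,
                                             # safe even for tuple values
msg_fmt = "Unknown table: {}".format(table)  # str.format

assert msg_f == msg_pct == msg_fmt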
def client_label_map(name: str) -> str:
if name == "website":
@@ -386,7 +342,7 @@ def client_label_map(name: str) -> str:
return name
def rewrite_client_arrays(value_arrays: Dict[str, List[int]]) -> Dict[str, List[int]]:
mapped_arrays: Dict[str, List[int]] = {}
mapped_arrays = {} # type: Dict[str, List[int]]
for label, array in value_arrays.items():
mapped_label = client_label_map(label)
if mapped_label in mapped_arrays:
@@ -404,7 +360,7 @@ def get_time_series_by_subgroup(stat: CountStat,
include_empty_subgroups: bool) -> Dict[str, List[int]]:
queryset = table_filtered_to_id(table, key_id).filter(property=stat.property) \
.values_list('subgroup', 'end_time', 'value')
value_dicts: Dict[Optional[str], Dict[datetime, int]] = defaultdict(lambda: defaultdict(int))
value_dicts = defaultdict(lambda: defaultdict(int)) # type: Dict[Optional[str], Dict[datetime, int]]
for subgroup, end_time, value in queryset:
value_dicts[subgroup][end_time] = value
value_arrays = {}
@@ -422,7 +378,7 @@ def get_time_series_by_subgroup(stat: CountStat,
eastern_tz = pytz.timezone('US/Eastern')
def make_table(title: str, cols: Sequence[str], rows: Sequence[Any], has_row_class: bool = False) -> str:
def make_table(title: str, cols: List[str], rows: List[Any], has_row_class: bool=False) -> str:
if not has_row_class:
def fix_row(row: Any) -> Dict[str, Any]:
@@ -433,7 +389,7 @@ def make_table(title: str, cols: Sequence[str], rows: Sequence[Any], has_row_cla
content = loader.render_to_string(
'analytics/ad_hoc_query.html',
dict(data=data),
dict(data=data)
)
return content
@@ -448,7 +404,7 @@ def dictfetchall(cursor: connection.cursor) -> List[Dict[str, Any]]:
def get_realm_day_counts() -> Dict[str, Dict[str, str]]:
query = SQL('''
query = '''
select
r.string_id,
(now()::date - date_sent::date) age,
@@ -469,13 +425,13 @@ def get_realm_day_counts() -> Dict[str, Dict[str, str]]:
order by
r.string_id,
age
''')
'''
cursor = connection.cursor()
cursor.execute(query)
rows = dictfetchall(cursor)
cursor.close()
counts: Dict[str, Dict[int, int]] = defaultdict(dict)
counts = defaultdict(dict) # type: Dict[str, Dict[int, int]]
for row in rows:
counts[row['string_id']][row['age']] = row['cnt']
@@ -495,7 +451,7 @@ def get_realm_day_counts() -> Dict[str, Dict[str, str]]:
else:
good_bad = 'neutral'
return f'<td class="number {good_bad}">{cnt}</td>'
return '<td class="number %s">%s</td>' % (good_bad, cnt)
cnts = (format_count(raw_cnts[0], 'neutral')
+ ''.join(map(format_count, raw_cnts[1:])))
@@ -509,7 +465,7 @@ def get_plan_name(plan_type: int) -> str:
def realm_summary_table(realm_minutes: Dict[str, float]) -> str:
now = timezone_now()
query = SQL('''
query = '''
SELECT
realm.string_id,
realm.date_created,
@@ -611,7 +567,7 @@ def realm_summary_table(realm_minutes: Dict[str, float]) -> str:
last_visit > now() - interval '2 week'
)
ORDER BY dau_count DESC, string_id ASC
''')
'''
cursor = connection.cursor()
cursor.execute(query)
@@ -619,10 +575,10 @@ def realm_summary_table(realm_minutes: Dict[str, float]) -> str:
cursor.close()
# Fetch all the realm administrator users
realm_admins: Dict[str, List[str]] = defaultdict(list)
realm_admins = defaultdict(list) # type: Dict[str, List[str]]
for up in UserProfile.objects.select_related("realm").filter(
role=UserProfile.ROLE_REALM_ADMINISTRATOR,
is_active=True,
is_active=True
):
realm_admins[up.realm.string_id].append(up.delivery_email)
@@ -661,7 +617,7 @@ def realm_summary_table(realm_minutes: Dict[str, float]) -> str:
total_hours += hours
row['hours'] = str(int(hours))
try:
row['hours_per_user'] = '{:.1f}'.format(hours / row['dau_count'])
row['hours_per_user'] = '%.1f' % (hours / row['dau_count'],)
except Exception:
pass
@@ -706,7 +662,7 @@ def realm_summary_table(realm_minutes: Dict[str, float]) -> str:
content = loader.render_to_string(
'analytics/realm_summary_table.html',
dict(rows=rows, num_active_sites=num_active_sites,
now=now.strftime('%Y-%m-%dT%H:%M:%SZ')),
now=now.strftime('%Y-%m-%dT%H:%M:%SZ'))
)
return content
@@ -720,18 +676,18 @@ def user_activity_intervals() -> Tuple[mark_safe, Dict[str, float]]:
all_intervals = UserActivityInterval.objects.filter(
end__gte=day_start,
start__lte=day_end,
start__lte=day_end
).select_related(
'user_profile',
'user_profile__realm',
'user_profile__realm'
).only(
'start',
'end',
'user_profile__delivery_email',
'user_profile__realm__string_id',
'user_profile__realm__string_id'
).order_by(
'user_profile__realm__string_id',
'user_profile__delivery_email',
'user_profile__delivery_email'
)
by_string_id = lambda row: row.user_profile.realm.string_id
@@ -741,7 +697,7 @@ def user_activity_intervals() -> Tuple[mark_safe, Dict[str, float]]:
for string_id, realm_intervals in itertools.groupby(all_intervals, by_string_id):
realm_duration = timedelta(0)
output += f'<hr>{string_id}\n'
output += '<hr>%s\n' % (string_id,)
for email, intervals in itertools.groupby(realm_intervals, by_email):
duration = timedelta(0)
for interval in intervals:
@@ -751,13 +707,13 @@ def user_activity_intervals() -> Tuple[mark_safe, Dict[str, float]]:
total_duration += duration
realm_duration += duration
output += f" {email:<37}{duration}\n"
output += " %-*s%s\n" % (37, email, duration)
realm_minutes[string_id] = realm_duration.total_seconds() / 60
output += f"\nTotal Duration: {total_duration}\n"
output += f"\nTotal Duration in minutes: {total_duration.total_seconds() / 60.}\n"
output += f"Total Duration amortized to a month: {total_duration.total_seconds() * 30. / 60.}"
output += "\nTotal Duration: %s\n" % (total_duration,)
output += "\nTotal Duration in minutes: %s\n" % (total_duration.total_seconds() / 60.,)
output += "Total Duration amortized to a month: %s" % (total_duration.total_seconds() * 30. / 60.,)
content = mark_safe('<pre>' + output + '</pre>')
return content, realm_minutes
@@ -767,10 +723,10 @@ def sent_messages_report(realm: str) -> str:
cols = [
'Date',
'Humans',
'Bots',
'Bots'
]
query = SQL('''
query = '''
select
series.day::date,
humans.cnt,
@@ -820,7 +776,7 @@ def sent_messages_report(realm: str) -> str:
date_sent::date
) bots on
series.day = bots.date_sent
''')
'''
cursor = connection.cursor()
cursor.execute(query, [realm, realm])
rows = cursor.fetchall()
@@ -829,8 +785,8 @@ def sent_messages_report(realm: str) -> str:
return make_table(title, cols, rows)
def ad_hoc_queries() -> List[Dict[str, str]]:
def get_page(query: Composable, cols: Sequence[str], title: str,
totals_columns: Sequence[int]=[]) -> Dict[str, str]:
def get_page(query: str, cols: List[str], title: str,
totals_columns: List[int]=[]) -> Dict[str, str]:
cursor = connection.cursor()
cursor.execute(query)
rows = cursor.fetchall()
@@ -865,7 +821,7 @@ def ad_hoc_queries() -> List[Dict[str, str]]:
return dict(
content=content,
title=title,
title=title
)
pages = []
@@ -873,9 +829,9 @@ def ad_hoc_queries() -> List[Dict[str, str]]:
###
for mobile_type in ['Android', 'ZulipiOS']:
title = f'{mobile_type} usage'
title = '%s usage' % (mobile_type,)
query = SQL('''
query = '''
select
realm.string_id,
up.id user_id,
@@ -887,20 +843,18 @@ def ad_hoc_queries() -> List[Dict[str, str]]:
join zerver_userprofile up on up.id = ua.user_profile_id
join zerver_realm realm on realm.id = up.realm_id
where
client.name like {mobile_type}
client.name like '%s'
group by string_id, up.id, client.name
having max(last_visit) > now() - interval '2 week'
order by string_id, up.id, client.name
''').format(
mobile_type=Literal(mobile_type),
)
''' % (mobile_type,)
cols = [
'Realm',
'User id',
'Name',
'Hits',
'Last time',
'Last time'
]
pages.append(get_page(query, cols, title))
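
The hunk above is more than cosmetic: 2.1.x splices `mobile_type` into the SQL text with `'%s' % (mobile_type,)` (safe here only because the value comes from a hardcoded list), while 3.x builds the statement with psycopg2's composition API so the driver quotes the value as data. A minimal sketch of that API; `conn` is an assumed live psycopg2 connection and the query itself is illustrative:

from psycopg2.sql import SQL, Literal

mobile_type = 'ZulipiOS'

# SQL() marks the static skeleton; Literal() wraps the value so psycopg2
# renders it as a properly quoted SQL literal.
query = SQL('''
    select client.name, count(*)
    from zerver_client client
    where client.name like {mobile_type}
    group by client.name
''').format(mobile_type=Literal(mobile_type))

# A Composable executes exactly like a plain query string:
# with conn.cursor() as cursor:
#     cursor.execute(query)
#     rows = cursor.fetchall()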
@@ -909,7 +863,7 @@ def ad_hoc_queries() -> List[Dict[str, str]]:
title = 'Desktop users'
query = SQL('''
query = '''
select
realm.string_id,
client.name,
@@ -924,13 +878,13 @@ def ad_hoc_queries() -> List[Dict[str, str]]:
group by string_id, client.name
having max(last_visit) > now() - interval '2 week'
order by string_id, client.name
''')
'''
cols = [
'Realm',
'Client',
'Hits',
'Last time',
'Last time'
]
pages.append(get_page(query, cols, title))
@@ -939,7 +893,7 @@ def ad_hoc_queries() -> List[Dict[str, str]]:
title = 'Integrations by realm'
query = SQL('''
query = '''
select
realm.string_id,
case
@@ -962,13 +916,13 @@ def ad_hoc_queries() -> List[Dict[str, str]]:
group by string_id, client_name
having max(last_visit) > now() - interval '2 week'
order by string_id, client_name
''')
'''
cols = [
'Realm',
'Client',
'Hits',
'Last time',
'Last time'
]
pages.append(get_page(query, cols, title))
@@ -977,7 +931,7 @@ def ad_hoc_queries() -> List[Dict[str, str]]:
title = 'Integrations by client'
query = SQL('''
query = '''
select
case
when query like '%%external%%' then split_part(query, '/', 5)
@@ -1000,20 +954,20 @@ def ad_hoc_queries() -> List[Dict[str, str]]:
group by client_name, string_id
having max(last_visit) > now() - interval '2 week'
order by client_name, string_id
''')
'''
cols = [
'Client',
'Realm',
'Hits',
'Last time',
'Last time'
]
pages.append(get_page(query, cols, title))
title = 'Remote Zulip servers'
query = SQL('''
query = '''
with icount as (
select
server_id,
@@ -1040,7 +994,7 @@ def ad_hoc_queries() -> List[Dict[str, str]]:
left join icount on icount.server_id = rserver.id
left join remote_push_devices on remote_push_devices.server_id = rserver.id
order by max_value DESC NULLS LAST, push_user_count DESC NULLS LAST
''')
'''
cols = [
'ID',
@@ -1059,8 +1013,8 @@ def ad_hoc_queries() -> List[Dict[str, str]]:
@require_server_admin
@has_request_variables
def get_activity(request: HttpRequest) -> HttpResponse:
duration_content, realm_minutes = user_activity_intervals()
counts_content: str = realm_summary_table(realm_minutes)
duration_content, realm_minutes = user_activity_intervals() # type: Tuple[mark_safe, Dict[str, float]]
counts_content = realm_summary_table(realm_minutes) # type: str
data = [
('Counts', counts_content),
('Durations', duration_content),
@@ -1086,6 +1040,13 @@ def get_confirmations(types: List[int], object_ids: List[int],
realm = confirmation.realm
content_object = confirmation.content_object
if realm is not None:
realm_host = realm.host
elif isinstance(content_object, Realm):
realm_host = content_object.host
else:
realm_host = hostname
type = confirmation.type
days_to_activate = _properties[type].validity_in_days
expiry_date = confirmation.date_sent + timedelta(days=days_to_activate)
@@ -1103,7 +1064,7 @@ def get_confirmations(types: List[int], object_ids: List[int],
else:
expires_in = "Expired"
url = confirmation_url(confirmation.confirmation_key, realm, type)
url = confirmation_url(confirmation.confirmation_key, realm_host, type)
confirmation_dicts.append({"object": confirmation.content_object,
"url": url, "type": type, "link_status": link_status,
"expires_in": expires_in})
@@ -1111,50 +1072,43 @@ def get_confirmations(types: List[int], object_ids: List[int],
@require_server_admin
def support(request: HttpRequest) -> HttpResponse:
context: Dict[str, Any] = {}
context = {} # type: Dict[str, Any]
if settings.BILLING_ENABLED and request.method == "POST":
# We check that request.POST only has two keys in it: The
# realm_id and a field to change.
keys = set(request.POST.keys())
if "csrfmiddlewaretoken" in keys:
keys.remove("csrfmiddlewaretoken")
assert(len(keys) == 2)
realm_id = request.POST.get("realm_id")
realm_id = request.POST.get("realm_id", None)
realm = Realm.objects.get(id=realm_id)
if request.POST.get("plan_type", None) is not None:
new_plan_type = int(request.POST.get("plan_type"))
new_plan_type = request.POST.get("plan_type", None)
if new_plan_type is not None:
new_plan_type = int(new_plan_type)
current_plan_type = realm.plan_type
do_change_plan_type(realm, new_plan_type)
msg = f"Plan type of {realm.name} changed from {get_plan_name(current_plan_type)} to {get_plan_name(new_plan_type)} "
msg = "Plan type of {} changed from {} to {} ".format(realm.name,
get_plan_name(current_plan_type),
get_plan_name(new_plan_type))
context["message"] = msg
elif request.POST.get("discount", None) is not None:
new_discount = Decimal(request.POST.get("discount"))
new_discount = request.POST.get("discount", None)
if new_discount is not None:
new_discount = Decimal(new_discount)
current_discount = get_discount_for_realm(realm)
attach_discount_to_realm(realm, new_discount)
msg = f"Discount of {realm.name} changed to {new_discount} from {current_discount} "
msg = "Discount of {} changed to {} from {} ".format(realm.name, new_discount, current_discount)
context["message"] = msg
elif request.POST.get("status", None) is not None:
status = request.POST.get("status")
status = request.POST.get("status", None)
if status is not None:
if status == "active":
do_send_realm_reactivation_email(realm)
context["message"] = f"Realm reactivation email sent to admins of {realm.name}."
context["message"] = "Realm reactivation email sent to admins of {}.".format(realm.name)
elif status == "deactivated":
do_deactivate_realm(realm, request.user)
context["message"] = f"{realm.name} deactivated."
elif request.POST.get("sponsorship_pending", None) is not None:
sponsorship_pending = request.POST.get("sponsorship_pending")
if sponsorship_pending == "true":
update_sponsorship_status(realm, True)
context["message"] = f"{realm.name} marked as pending sponsorship."
elif sponsorship_pending == "false":
update_sponsorship_status(realm, False)
context["message"] = f"{realm.name} is no longer pending sponsorship."
elif request.POST.get("scrub_realm", None) is not None:
if request.POST.get("scrub_realm") == "scrub_realm":
do_scrub_realm(realm, acting_user=request.user)
context["message"] = f"{realm.name} scrubbed."
context["message"] = "{} deactivated.".format(realm.name)
scrub_realm = request.POST.get("scrub_realm", None)
if scrub_realm is not None:
if scrub_realm == "scrub_realm":
do_scrub_realm(realm)
context["message"] = "{} scrubbed.".format(realm.name)
query = request.GET.get("q", None)
if query:
@@ -1170,7 +1124,7 @@ def support(request: HttpRequest) -> HttpResponse:
hostname = parse_result.hostname
assert hostname is not None
if parse_result.port:
hostname = f"{hostname}:{parse_result.port}"
hostname = "{}:{}".format(hostname, parse_result.port)
subdomain = get_subdomain_from_hostname(hostname)
try:
realms.add(get_realm(subdomain))
@@ -1179,12 +1133,9 @@ def support(request: HttpRequest) -> HttpResponse:
except ValidationError:
pass
for realm in realms:
realm.customer = get_customer_by_realm(realm)
context["realms"] = realms
confirmations: List[Dict[str, Any]] = []
confirmations = [] # type: List[Dict[str, Any]]
preregistration_users = PreregistrationUser.objects.filter(email__in=key_words)
confirmations += get_confirmations([Confirmation.USER_REGISTRATION, Confirmation.INVITATION,
@@ -1199,8 +1150,7 @@ def support(request: HttpRequest) -> HttpResponse:
context["confirmations"] = confirmations
def realm_admin_emails(realm: Realm) -> str:
return ", ".join(realm.get_human_admin_users().order_by('delivery_email').values_list(
"delivery_email", flat=True))
return ", ".join(realm.get_human_admin_users().values_list("delivery_email", flat=True))
context["realm_admin_emails"] = realm_admin_emails
context["get_discount_for_realm"] = get_discount_for_realm
@@ -1221,7 +1171,7 @@ def get_user_activity_records_for_realm(realm: str, is_bot: bool) -> QuerySet:
records = UserActivity.objects.filter(
user_profile__realm__string_id=realm,
user_profile__is_active=True,
user_profile__is_bot=is_bot,
user_profile__is_bot=is_bot
)
records = records.order_by("user_profile__delivery_email", "-last_visit")
records = records.select_related('user_profile', 'client').only(*fields)
@@ -1233,11 +1183,11 @@ def get_user_activity_records_for_email(email: str) -> List[QuerySet]:
'query',
'client__name',
'count',
'last_visit',
'last_visit'
]
records = UserActivity.objects.filter(
user_profile__delivery_email=email,
user_profile__delivery_email=email
)
records = records.order_by("-last_visit")
records = records.select_related('user_profile', 'client').only(*fields)
@@ -1248,7 +1198,7 @@ def raw_user_activity_table(records: List[QuerySet]) -> str:
'query',
'client',
'count',
'last_visit',
'last_visit'
]
def row(record: QuerySet) -> List[Any]:
@@ -1256,7 +1206,7 @@ def raw_user_activity_table(records: List[QuerySet]) -> str:
record.query,
record.client.name,
record.count,
format_date_for_activity_reports(record.last_visit),
format_date_for_activity_reports(record.last_visit)
]
rows = list(map(row, records))
@@ -1269,19 +1219,19 @@ def get_user_activity_summary(records: List[QuerySet]) -> Dict[str, Dict[str, An
#: We could use something like:
# `Union[Dict[str, Dict[str, int]], Dict[str, Dict[str, datetime]]]`
#: but that would require this long `Union` to carry on throughout inner functions.
summary: Dict[str, Dict[str, Any]] = {}
summary = {} # type: Dict[str, Dict[str, Any]]
def update(action: str, record: QuerySet) -> None:
if action not in summary:
summary[action] = dict(
count=record.count,
last_visit=record.last_visit,
last_visit=record.last_visit
)
else:
summary[action]['count'] += record.count
summary[action]['last_visit'] = max(
summary[action]['last_visit'],
record.last_visit,
record.last_visit
)
if records:
@@ -1321,25 +1271,25 @@ def format_date_for_activity_reports(date: Optional[datetime]) -> str:
def user_activity_link(email: str) -> mark_safe:
url_name = 'analytics.views.get_user_activity'
url = reverse(url_name, kwargs=dict(email=email))
email_link = f'<a href="{url}">{email}</a>'
email_link = '<a href="%s">%s</a>' % (url, email)
return mark_safe(email_link)
def realm_activity_link(realm_str: str) -> mark_safe:
url_name = 'analytics.views.get_realm_activity'
url = reverse(url_name, kwargs=dict(realm_str=realm_str))
realm_link = f'<a href="{url}">{realm_str}</a>'
realm_link = '<a href="%s">%s</a>' % (url, realm_str)
return mark_safe(realm_link)
def realm_stats_link(realm_str: str) -> mark_safe:
url_name = 'analytics.views.stats_for_realm'
url = reverse(url_name, kwargs=dict(realm_str=realm_str))
stats_link = f'<a href="{url}"><i class="fa fa-pie-chart"></i>{realm_str}</a>'
stats_link = '<a href="{}"><i class="fa fa-pie-chart"></i>{}</a>'.format(url, realm_str)
return mark_safe(stats_link)
def remote_installation_stats_link(server_id: int, hostname: str) -> mark_safe:
url_name = 'analytics.views.stats_for_remote_installation'
url = reverse(url_name, kwargs=dict(remote_server_id=server_id))
stats_link = f'<a href="{url}"><i class="fa fa-pie-chart"></i>{hostname}</a>'
stats_link = '<a href="{}"><i class="fa fa-pie-chart"></i>{}</a>'.format(url, hostname)
return mark_safe(stats_link)
def realm_client_table(user_summaries: Dict[str, Dict[str, Dict[str, Any]]]) -> str:
@@ -1480,13 +1430,13 @@ def realm_user_summary_table(all_records: List[QuerySet],
@require_server_admin
def get_realm_activity(request: HttpRequest, realm_str: str) -> HttpResponse:
data: List[Tuple[str, str]] = []
all_user_records: Dict[str, Any] = {}
data = [] # type: List[Tuple[str, str]]
all_user_records = {} # type: Dict[str, Any]
try:
admins = Realm.objects.get(string_id=realm_str).get_human_admin_users()
except Realm.DoesNotExist:
return HttpResponseNotFound(f"Realm {realm_str} does not exist")
return HttpResponseNotFound("Realm %s does not exist" % (realm_str,))
admin_emails = {admin.delivery_email for admin in admins}
@@ -1517,7 +1467,7 @@ def get_realm_activity(request: HttpRequest, realm_str: str) -> HttpResponse:
def get_user_activity(request: HttpRequest, email: str) -> HttpResponse:
records = get_user_activity_records_for_email(email)
data: List[Tuple[str, str]] = []
data = [] # type: List[Tuple[str, str]]
user_summary = get_user_activity_summary(records)
content = user_activity_summary_table(user_summary)


@@ -4,7 +4,6 @@ module.exports = {
"@babel/preset-env",
{
corejs: 3,
loose: true, // Loose mode for…of loops are 5× faster in Firefox
useBuiltIns: "usage",
},
],
@@ -12,7 +11,7 @@ module.exports = {
],
plugins: [
"@babel/proposal-class-properties",
["@babel/plugin-proposal-unicode-property-regex", {useUnicodeFlag: false}],
["@babel/plugin-proposal-unicode-property-regex", { useUnicodeFlag: false }],
],
sourceType: "unambiguous",
};


@@ -1,3 +1,5 @@
# -*- coding: utf-8 -*-
# Copyright: (c) 2008, Jarek Zgoda <jarek.zgoda@gmail.com>
# Permission is hereby granted, free of charge, to any person obtaining a


@@ -1,5 +1,6 @@
# -*- coding: utf-8 -*-
from django.db import models, migrations
import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):


@@ -1,5 +1,6 @@
# -*- coding: utf-8 -*-
from django.db import models, migrations
import django.utils.timezone
from django.db import migrations, models
class Migration(migrations.Migration):


@@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.10.4 on 2017-01-17 09:16
from django.db import migrations


@@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.11.2 on 2017-07-08 04:23
from django.db import migrations, models


@@ -1,6 +1,9 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.11.6 on 2017-11-30 00:13
import django.db.models.deletion
from __future__ import unicode_literals
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):


@@ -1,4 +1,6 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.11.6 on 2018-01-29 18:39
from __future__ import unicode_literals
from django.db import migrations, models


@@ -1,37 +0,0 @@
# Generated by Django 2.2.10 on 2020-03-27 09:02
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('confirmation', '0006_realmcreationkey_presume_email_valid'),
]
operations = [
migrations.AlterField(
model_name='confirmation',
name='confirmation_key',
field=models.CharField(db_index=True, max_length=40),
),
migrations.AlterField(
model_name='confirmation',
name='date_sent',
field=models.DateTimeField(db_index=True),
),
migrations.AlterField(
model_name='confirmation',
name='object_id',
field=models.PositiveIntegerField(db_index=True),
),
migrations.AlterField(
model_name='realmcreationkey',
name='creation_key',
field=models.CharField(db_index=True, max_length=40, verbose_name='activation key'),
),
migrations.AlterUniqueTogether(
name='confirmation',
unique_together={('type', 'confirmation_key')},
),
]


@@ -1,24 +1,26 @@
# -*- coding: utf-8 -*-
# Copyright: (c) 2008, Jarek Zgoda <jarek.zgoda@gmail.com>
__revision__ = '$Id: models.py 28 2009-10-22 15:03:02Z jarek.zgoda $'
import datetime
import string
from random import SystemRandom
from typing import Mapping, Optional, Union
from urllib.parse import urljoin
from django.conf import settings
from django.contrib.contenttypes.fields import GenericForeignKey
from django.contrib.contenttypes.models import ContentType
import datetime
from django.db import models
from django.db.models import CASCADE
from django.urls import reverse
from django.conf import settings
from django.contrib.contenttypes.models import ContentType
from django.contrib.contenttypes.fields import GenericForeignKey
from django.http import HttpRequest, HttpResponse
from django.shortcuts import render
from django.urls import reverse
from django.utils.timezone import now as timezone_now
from zerver.models import EmailChangeStatus, MultiuseInvite, PreregistrationUser, Realm, UserProfile
from zerver.models import PreregistrationUser, EmailChangeStatus, MultiuseInvite, \
UserProfile, Realm
from random import SystemRandom
import string
from typing import Dict, Optional, Union
class ConfirmationKeyException(Exception):
WRONG_LENGTH = 1
@@ -43,8 +45,7 @@ def generate_key() -> str:
ConfirmationObjT = Union[MultiuseInvite, PreregistrationUser, EmailChangeStatus]
def get_object_from_key(confirmation_key: str,
confirmation_type: int,
activate_object: bool=True) -> ConfirmationObjT:
confirmation_type: int) -> ConfirmationObjT:
# Confirmation keys used to be 40 characters
if len(confirmation_key) not in (24, 40):
raise ConfirmationKeyException(ConfirmationKeyException.WRONG_LENGTH)
@@ -59,14 +60,14 @@ def get_object_from_key(confirmation_key: str,
raise ConfirmationKeyException(ConfirmationKeyException.EXPIRED)
obj = confirmation.content_object
if activate_object and hasattr(obj, "status"):
if hasattr(obj, "status"):
obj.status = getattr(settings, 'STATUS_ACTIVE', 1)
obj.save(update_fields=['status'])
return obj
def create_confirmation_link(obj: ContentType,
def create_confirmation_link(obj: ContentType, host: str,
confirmation_type: int,
url_args: Mapping[str, str] = {}) -> str:
url_args: Optional[Dict[str, str]]=None) -> str:
key = generate_key()
realm = None
if hasattr(obj, 'realm'):
@@ -76,25 +77,24 @@ def create_confirmation_link(obj: ContentType,
Confirmation.objects.create(content_object=obj, date_sent=timezone_now(), confirmation_key=key,
realm=realm, type=confirmation_type)
return confirmation_url(key, realm, confirmation_type, url_args)
return confirmation_url(key, host, confirmation_type, url_args)
def confirmation_url(confirmation_key: str, realm: Optional[Realm],
def confirmation_url(confirmation_key: str, host: str,
confirmation_type: int,
url_args: Mapping[str, str] = {}) -> str:
url_args = dict(url_args)
url_args: Optional[Dict[str, str]]=None) -> str:
if url_args is None:
url_args = {}
url_args['confirmation_key'] = confirmation_key
return urljoin(
settings.ROOT_DOMAIN_URI if realm is None else realm.uri,
reverse(_properties[confirmation_type].url_name, kwargs=url_args),
)
return '%s%s%s' % (settings.EXTERNAL_URI_SCHEME, host,
reverse(_properties[confirmation_type].url_name, kwargs=url_args))
class Confirmation(models.Model):
content_type = models.ForeignKey(ContentType, on_delete=CASCADE)
object_id: int = models.PositiveIntegerField(db_index=True)
object_id = models.PositiveIntegerField() # type: int
content_object = GenericForeignKey('content_type', 'object_id')
date_sent: datetime.datetime = models.DateTimeField(db_index=True)
confirmation_key: str = models.CharField(max_length=40, db_index=True)
realm: Optional[Realm] = models.ForeignKey(Realm, null=True, on_delete=CASCADE)
date_sent = models.DateTimeField() # type: datetime.datetime
confirmation_key = models.CharField(max_length=40) # type: str
realm = models.ForeignKey(Realm, null=True, on_delete=CASCADE) # type: Optional[Realm]
# The following list is the set of valid types
USER_REGISTRATION = 1
@@ -105,13 +105,10 @@ class Confirmation(models.Model):
MULTIUSE_INVITE = 6
REALM_CREATION = 7
REALM_REACTIVATION = 8
type: int = models.PositiveSmallIntegerField()
type = models.PositiveSmallIntegerField() # type: int
def __str__(self) -> str:
return f'<Confirmation: {self.content_object}>'
class Meta:
unique_together = ("type", "confirmation_key")
return '<Confirmation: %s>' % (self.content_object,)
class ConfirmationType:
def __init__(self, url_name: str,
@@ -138,7 +135,7 @@ def one_click_unsubscribe_link(user_profile: UserProfile, email_type: str) -> st
Generate a unique link that a logged-out user can visit to unsubscribe from
Zulip e-mails without having to first log in.
"""
return create_confirmation_link(user_profile,
return create_confirmation_link(user_profile, user_profile.realm.host,
Confirmation.UNSUBSCRIBE,
url_args = {'email_type': email_type})
@@ -168,18 +165,18 @@ def generate_realm_creation_url(by_admin: bool=False) -> str:
RealmCreationKey.objects.create(creation_key=key,
date_created=timezone_now(),
presume_email_valid=by_admin)
return urljoin(
settings.ROOT_DOMAIN_URI,
reverse('zerver.views.create_realm', kwargs={'creation_key': key}),
)
return '%s%s%s' % (settings.EXTERNAL_URI_SCHEME,
settings.EXTERNAL_HOST,
reverse('zerver.views.create_realm',
kwargs={'creation_key': key}))
class RealmCreationKey(models.Model):
creation_key = models.CharField('activation key', db_index=True, max_length=40)
creation_key = models.CharField('activation key', max_length=40)
date_created = models.DateTimeField('created', default=timezone_now)
# True just if we should presume the email address the user enters
# is theirs, and skip sending mail to it to confirm that.
presume_email_valid: bool = models.BooleanField(default=False)
presume_email_valid = models.BooleanField(default=False) # type: bool
class Invalid(Exception):
pass
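
The URL construction change in this file is behavioral, not just stylistic: 2.1.x concatenates `EXTERNAL_URI_SCHEME + host + path` by hand, while 3.x resolves the reversed path against the realm's base URI (or ROOT_DOMAIN_URI) with urljoin. A minimal sketch of the urljoin behavior, with illustrative values in place of Django settings:

from urllib.parse import urljoin

realm_uri = "https://chat.example.com"                   # stands in for realm.uri
path = "/accounts/do_confirm/0123456789abcdefghijklmn"   # a reverse() result

# urljoin with an absolute path keeps the scheme and host and replaces
# everything after them:
assert urljoin(realm_uri, path) == realm_uri + path

# The 2.1.x equivalent glued scheme, host, and path together manually:
assert "%s%s%s" % ("https://", "chat.example.com", path) == realm_uri + path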


@@ -1,3 +1,5 @@
# -*- coding: utf-8 -*-
# Copyright: (c) 2008, Jarek Zgoda <jarek.zgoda@gmail.com>
__revision__ = '$Id: settings.py 12 2008-11-23 19:38:52Z jarek.zgoda $'


@@ -1,31 +1,25 @@
from datetime import datetime
from decimal import Decimal
from functools import wraps
import logging
import math
import os
from datetime import datetime, timedelta
from decimal import Decimal
from functools import wraps
from typing import Callable, Dict, Optional, Tuple, TypeVar, cast
import stripe
from typing import Any, Callable, Dict, Optional, TypeVar, Tuple, cast
import ujson
from django.conf import settings
from django.core.signing import Signer
from django.db import transaction
from django.utils.timezone import now as timezone_now
from django.utils.translation import ugettext as _
from corporate.models import (
Customer,
CustomerPlan,
LicenseLedger,
get_current_plan_by_customer,
get_current_plan_by_realm,
get_customer_by_realm,
)
from django.conf import settings
from django.db import transaction
from django.utils.translation import ugettext as _
from django.utils.timezone import now as timezone_now
from django.core.signing import Signer
import stripe
from zerver.lib.logging_util import log_to_file
from zerver.lib.timestamp import datetime_to_timestamp, timestamp_to_datetime
from zerver.lib.utils import generate_random_token
from zerver.models import Realm, RealmAuditLog, UserProfile
from zerver.models import Realm, UserProfile, RealmAuditLog
from corporate.models import Customer, CustomerPlan, LicenseLedger, \
get_current_plan
from zproject.config import get_secret
STRIPE_PUBLISHABLE_KEY = get_secret('stripe_publishable_key')
@@ -39,10 +33,9 @@ billing_logger = logging.getLogger('corporate.stripe')
log_to_file(billing_logger, BILLING_LOG_PATH)
log_to_file(logging.getLogger('stripe'), BILLING_LOG_PATH)
CallableT = TypeVar('CallableT', bound=Callable[..., object])
CallableT = TypeVar('CallableT', bound=Callable[..., Any])
MIN_INVOICED_LICENSES = 30
MAX_INVOICED_LICENSES = 1000
DEFAULT_INVOICE_DAYS_UNTIL_DUE = 30
def get_latest_seat_count(realm: Realm) -> int:
@@ -87,13 +80,9 @@ def next_month(billing_cycle_anchor: datetime, dt: datetime) -> datetime:
if 20 < (proposed_next_month - dt).days < 40:
return proposed_next_month
raise AssertionError('Something wrong in next_month calculation with '
f'billing_cycle_anchor: {billing_cycle_anchor}, dt: {dt}')
'billing_cycle_anchor: %s, dt: %s' % (billing_cycle_anchor, dt))
def start_of_next_billing_cycle(plan: CustomerPlan, event_time: datetime) -> datetime:
if plan.status == CustomerPlan.FREE_TRIAL:
assert(plan.next_invoice_date is not None) # for mypy
return plan.next_invoice_date
months_per_period = {
CustomerPlan.ANNUAL: 12,
CustomerPlan.MONTHLY: 1,
@@ -125,26 +114,17 @@ def next_invoice_date(plan: CustomerPlan) -> Optional[datetime]:
def renewal_amount(plan: CustomerPlan, event_time: datetime) -> int: # nocoverage: TODO
if plan.fixed_price is not None:
return plan.fixed_price
new_plan, last_ledger_entry = make_end_of_cycle_updates_if_needed(plan, event_time)
last_ledger_entry = make_end_of_cycle_updates_if_needed(plan, event_time)
if last_ledger_entry is None:
return 0
if last_ledger_entry.licenses_at_next_renewal is None:
return 0
if new_plan is not None:
plan = new_plan
assert(plan.price_per_license is not None) # for mypy
return plan.price_per_license * last_ledger_entry.licenses_at_next_renewal
def get_idempotency_key(ledger_entry: LicenseLedger) -> Optional[str]:
if settings.TEST_SUITE:
return None
return f'ledger_entry:{ledger_entry.id}' # nocoverage
class BillingError(Exception):
# error messages
CONTACT_SUPPORT = _("Something went wrong. Please contact {email}.").format(
email=settings.ZULIP_ADMINISTRATOR,
)
CONTACT_SUPPORT = _("Something went wrong. Please contact %s.") % (settings.ZULIP_ADMINISTRATOR,)
TRY_RELOADING = _("Something went wrong. Please reload the page.")
# description is used only for tests
@@ -160,7 +140,7 @@ class StripeConnectionError(BillingError):
def catch_stripe_errors(func: CallableT) -> CallableT:
@wraps(func)
def wrapped(*args: object, **kwargs: object) -> object:
def wrapped(*args: Any, **kwargs: Any) -> Any:
if settings.DEVELOPMENT and not settings.TEST_SUITE: # nocoverage
if STRIPE_PUBLISHABLE_KEY is None:
raise BillingError('missing stripe config', "Missing Stripe config. "
@@ -172,19 +152,18 @@ def catch_stripe_errors(func: CallableT) -> CallableT:
# https://stripe.com/docs/error-codes gives a more detailed set of error codes
except stripe.error.StripeError as e:
err = e.json_body.get('error', {})
billing_logger.error(
"Stripe error: %s %s %s %s",
e.http_status, err.get('type'), err.get('code'), err.get('param'),
)
billing_logger.error("Stripe error: %s %s %s %s" % (
e.http_status, err.get('type'), err.get('code'), err.get('param')))
if isinstance(e, stripe.error.CardError):
# TODO: Look into i18n for this
raise StripeCardError('card error', err.get('message'))
if isinstance(e, (stripe.error.RateLimitError, stripe.error.APIConnectionError)): # nocoverage TODO
if isinstance(e, stripe.error.RateLimitError) or \
isinstance(e, stripe.error.APIConnectionError): # nocoverage TODO
raise StripeConnectionError(
'stripe connection error',
_("Something went wrong. Please wait a few seconds and try again."))
raise BillingError('other stripe error', BillingError.CONTACT_SUPPORT)
return cast(CallableT, wrapped)
return wrapped # type: ignore # https://github.com/python/mypy/issues/1927
@catch_stripe_errors
def stripe_get_customer(stripe_customer_id: str) -> stripe.Customer:
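
The `catch_stripe_errors` hunk above also shows the typed-decorator idiom changing: 2.1.x types the wrapper loosely as `Callable[..., Any]` and silences mypy on the return, while 3.x binds CallableT to `Callable[..., object]` and casts the wrapper back to the decorated function's precise type. A minimal sketch of the 3.x shape, with a trivial error-wrapping decorator in place of the Stripe handling:

from functools import wraps
from typing import Callable, TypeVar, cast

CallableT = TypeVar('CallableT', bound=Callable[..., object])

def catch_errors(func: CallableT) -> CallableT:
    @wraps(func)
    def wrapped(*args: object, **kwargs: object) -> object:
        try:
            return func(*args, **kwargs)
        except Exception as e:
            raise RuntimeError("wrapped call failed") from e
    # cast() tells mypy the wrapper keeps func's exact signature.
    return cast(CallableT, wrapped)

@catch_errors
def add(x: int, y: int) -> int:
    return x + y

assert add(2, 3) == 5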
@@ -198,7 +177,7 @@ def do_create_stripe_customer(user: UserProfile, stripe_token: Optional[str]=Non
# bad thing that will happen is that we will create an extra stripe
# customer that we can delete or ignore.
stripe_customer = stripe.Customer.create(
description=f"{realm.string_id} ({realm.name})",
description="%s (%s)" % (realm.string_id, realm.name),
email=user.delivery_email,
metadata={'realm_id': realm.id, 'realm_str': realm.string_id},
source=stripe_token)
@@ -220,10 +199,7 @@ def do_create_stripe_customer(user: UserProfile, stripe_token: Optional[str]=Non
@catch_stripe_errors
def do_replace_payment_source(user: UserProfile, stripe_token: str,
pay_invoices: bool=False) -> stripe.Customer:
customer = get_customer_by_realm(user.realm)
assert(customer is not None) # for mypy
stripe_customer = stripe_get_customer(customer.stripe_customer_id)
stripe_customer = stripe_get_customer(Customer.objects.get(realm=user.realm).stripe_customer_id)
stripe_customer.source = stripe_token
# Deletes existing card: https://stripe.com/docs/api#update_customer-source
updated_stripe_customer = stripe.Customer.save(stripe_customer)
@@ -234,7 +210,7 @@ def do_replace_payment_source(user: UserProfile, stripe_token: str,
for stripe_invoice in stripe.Invoice.list(
billing='charge_automatically', customer=stripe_customer.id, status='open'):
# The user will get either a receipt or a "failed payment" email, but the in-app
# messaging could be clearer here (e.g. it could explicitly tell the user that there
# messaging could be clearer here (e.g. it could explictly tell the user that there
# were payment(s) and that they succeeded or failed).
# Worth fixing if we notice that a lot of cards end up failing at this step.
stripe.Invoice.pay(stripe_invoice)
@@ -242,77 +218,28 @@ def do_replace_payment_source(user: UserProfile, stripe_token: str,
# event_time should roughly be timezone_now(). Not designed to handle
# event_times in the past or future
@transaction.atomic
def make_end_of_cycle_updates_if_needed(plan: CustomerPlan,
event_time: datetime) -> Tuple[Optional[CustomerPlan], Optional[LicenseLedger]]:
event_time: datetime) -> Optional[LicenseLedger]:
last_ledger_entry = LicenseLedger.objects.filter(plan=plan).order_by('-id').first()
last_renewal = LicenseLedger.objects.filter(plan=plan, is_renewal=True) \
.order_by('-id').first().event_time
next_billing_cycle = start_of_next_billing_cycle(plan, last_renewal)
if next_billing_cycle <= event_time:
if plan.status == CustomerPlan.ACTIVE:
return None, LicenseLedger.objects.create(
return LicenseLedger.objects.create(
plan=plan, is_renewal=True, event_time=next_billing_cycle,
licenses=last_ledger_entry.licenses_at_next_renewal,
licenses_at_next_renewal=last_ledger_entry.licenses_at_next_renewal)
if plan.status == CustomerPlan.FREE_TRIAL:
plan.invoiced_through = last_ledger_entry
assert(plan.next_invoice_date is not None)
plan.billing_cycle_anchor = plan.next_invoice_date.replace(microsecond=0)
plan.status = CustomerPlan.ACTIVE
plan.save(update_fields=["invoiced_through", "billing_cycle_anchor", "status"])
return None, LicenseLedger.objects.create(
plan=plan, is_renewal=True, event_time=next_billing_cycle,
licenses=last_ledger_entry.licenses_at_next_renewal,
licenses_at_next_renewal=last_ledger_entry.licenses_at_next_renewal)
if plan.status == CustomerPlan.SWITCH_TO_ANNUAL_AT_END_OF_CYCLE:
if plan.fixed_price is not None: # nocoverage
raise NotImplementedError("Can't switch fixed priced monthly plan to annual.")
plan.status = CustomerPlan.ENDED
plan.save(update_fields=["status"])
discount = plan.customer.default_discount or plan.discount
_, _, _, price_per_license = compute_plan_parameters(
automanage_licenses=plan.automanage_licenses, billing_schedule=CustomerPlan.ANNUAL,
discount=plan.discount
)
new_plan = CustomerPlan.objects.create(
customer=plan.customer, billing_schedule=CustomerPlan.ANNUAL, automanage_licenses=plan.automanage_licenses,
charge_automatically=plan.charge_automatically, price_per_license=price_per_license,
discount=discount, billing_cycle_anchor=next_billing_cycle,
tier=plan.tier, status=CustomerPlan.ACTIVE, next_invoice_date=next_billing_cycle,
invoiced_through=None, invoicing_status=CustomerPlan.INITIAL_INVOICE_TO_BE_SENT,
)
new_plan_ledger_entry = LicenseLedger.objects.create(
plan=new_plan, is_renewal=True, event_time=next_billing_cycle,
licenses=last_ledger_entry.licenses_at_next_renewal,
licenses_at_next_renewal=last_ledger_entry.licenses_at_next_renewal
)
RealmAuditLog.objects.create(
realm=new_plan.customer.realm, event_time=event_time,
event_type=RealmAuditLog.CUSTOMER_SWITCHED_FROM_MONTHLY_TO_ANNUAL_PLAN,
extra_data=ujson.dumps({
"monthly_plan_id": plan.id,
"annual_plan_id": new_plan.id,
})
)
return new_plan, new_plan_ledger_entry
if plan.status == CustomerPlan.DOWNGRADE_AT_END_OF_CYCLE:
process_downgrade(plan)
return None, None
return None, last_ledger_entry
return None
return last_ledger_entry
# Returns Customer instead of stripe_customer so that we don't make a Stripe
# API call if there's nothing to update
def update_or_create_stripe_customer(user: UserProfile, stripe_token: Optional[str]=None) -> Customer:
realm = user.realm
customer = get_customer_by_realm(realm)
customer = Customer.objects.filter(realm=realm).first()
if customer is None or customer.stripe_customer_id is None:
return do_create_stripe_customer(user, stripe_token=stripe_token)
if stripe_token is not None:
@@ -321,8 +248,7 @@ def update_or_create_stripe_customer(user: UserProfile, stripe_token: Optional[s
def compute_plan_parameters(
automanage_licenses: bool, billing_schedule: int,
discount: Optional[Decimal],
free_trial: bool=False) -> Tuple[datetime, datetime, datetime, int]:
discount: Optional[Decimal]) -> Tuple[datetime, datetime, datetime, int]:
# Everything in Stripe is stored as timestamps with 1 second resolution,
# so standardize on 1 second resolution.
# TODO talk about leapseconds?
@@ -335,16 +261,13 @@ def compute_plan_parameters(
price_per_license = 800
period_end = add_months(billing_cycle_anchor, 1)
else:
raise AssertionError(f'Unknown billing_schedule: {billing_schedule}')
raise AssertionError('Unknown billing_schedule: {}'.format(billing_schedule))
if discount is not None:
# There are no fractional cents in Stripe, so round down to nearest integer.
price_per_license = int(float(price_per_license * (1 - discount / 100)) + .00001)
next_invoice_date = period_end
if automanage_licenses:
next_invoice_date = add_months(billing_cycle_anchor, 1)
if free_trial:
period_end = billing_cycle_anchor + timedelta(days=settings.FREE_TRIAL_DAYS)
next_invoice_date = period_end
return billing_cycle_anchor, next_invoice_date, period_end, price_per_license
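
The discount rounding above deserves a worked example: prices are integer cents, `int()` truncates, and the `+ .00001` epsilon guards against float error pulling an exact result just below the integer. Using the monthly price from this hunk (800 cents) and a hypothetical 25% discount:

from decimal import Decimal

price_per_license = 800   # monthly Zulip Standard, in cents (from above)
discount = Decimal(25)    # hypothetical 25% discount

# Without the epsilon, float rounding could turn an exact 600 into
# 599.999... and truncate to 599.
discounted = int(float(price_per_license * (1 - discount / 100)) + .00001)
assert discounted == 600  # 800 * 0.75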
# Only used for cloud signups
@@ -353,44 +276,38 @@ def process_initial_upgrade(user: UserProfile, licenses: int, automanage_license
billing_schedule: int, stripe_token: Optional[str]) -> None:
realm = user.realm
customer = update_or_create_stripe_customer(user, stripe_token=stripe_token)
charge_automatically = stripe_token is not None
free_trial = settings.FREE_TRIAL_DAYS not in (None, 0)
if get_current_plan_by_customer(customer) is not None:
if get_current_plan(customer) is not None:
# Unlikely race condition from two people upgrading (clicking "Make payment")
# at exactly the same time. Doesn't fully resolve the race condition, but having
# a check here reduces the likelihood.
billing_logger.warning(
"Customer %s trying to upgrade, but has an active subscription", customer,
)
"Customer {} trying to upgrade, but has an active subscription".format(customer))
raise BillingError('subscribing with existing subscription', BillingError.TRY_RELOADING)
billing_cycle_anchor, next_invoice_date, period_end, price_per_license = compute_plan_parameters(
automanage_licenses, billing_schedule, customer.default_discount, free_trial)
automanage_licenses, billing_schedule, customer.default_discount)
# The main design constraint in this function is that if you upgrade with a credit card, and the
# charge fails, everything should be rolled back as if nothing had happened. This is because we
# expect frequent card failures on initial signup.
# Hence, if we're going to charge a card, do it at the beginning, even if we later may have to
# adjust the number of licenses.
charge_automatically = stripe_token is not None
if charge_automatically:
if not free_trial:
stripe_charge = stripe.Charge.create(
amount=price_per_license * licenses,
currency='usd',
customer=customer.stripe_customer_id,
description=f"Upgrade to Zulip Standard, ${price_per_license/100} x {licenses}",
receipt_email=user.delivery_email,
statement_descriptor='Zulip Standard')
# Not setting a period start and end, but maybe we should? Unclear what will make things
# most similar to the renewal case from an accounting perspective.
assert isinstance(stripe_charge.source, stripe.Card)
description = f"Payment (Card ending in {stripe_charge.source.last4})"
stripe.InvoiceItem.create(
amount=price_per_license * licenses * -1,
currency='usd',
customer=customer.stripe_customer_id,
description=description,
discountable=False)
stripe_charge = stripe.Charge.create(
amount=price_per_license * licenses,
currency='usd',
customer=customer.stripe_customer_id,
description="Upgrade to Zulip Standard, ${} x {}".format(price_per_license/100, licenses),
receipt_email=user.delivery_email,
statement_descriptor='Zulip Standard')
# Not setting a period start and end, but maybe we should? Unclear what will make things
# most similar to the renewal case from an accounting perspective.
stripe.InvoiceItem.create(
amount=price_per_license * licenses * -1,
currency='usd',
customer=customer.stripe_customer_id,
description="Payment (Card ending in {})".format(cast(stripe.Card, stripe_charge.source).last4),
discountable=False)
# TODO: The correctness of this relies on user creation, deactivation, etc being
# in a transaction.atomic() with the relevant RealmAuditLog entries
@@ -406,8 +323,6 @@ def process_initial_upgrade(user: UserProfile, licenses: int, automanage_license
'billing_cycle_anchor': billing_cycle_anchor,
'billing_schedule': billing_schedule,
'tier': CustomerPlan.STANDARD}
if free_trial:
plan_params['status'] = CustomerPlan.FREE_TRIAL
plan = CustomerPlan.objects.create(
customer=customer,
next_invoice_date=next_invoice_date,
@@ -424,52 +339,49 @@ def process_initial_upgrade(user: UserProfile, licenses: int, automanage_license
realm=realm, acting_user=user, event_time=billing_cycle_anchor,
event_type=RealmAuditLog.CUSTOMER_PLAN_CREATED,
extra_data=ujson.dumps(plan_params))
-     if not free_trial:
-         stripe.InvoiceItem.create(
-             currency='usd',
-             customer=customer.stripe_customer_id,
-             description='Zulip Standard',
-             discountable=False,
-             period = {'start': datetime_to_timestamp(billing_cycle_anchor),
-                       'end': datetime_to_timestamp(period_end)},
-             quantity=billed_licenses,
-             unit_amount=price_per_license)
-         if charge_automatically:
-             billing_method = 'charge_automatically'
-             days_until_due = None
-         else:
-             billing_method = 'send_invoice'
-             days_until_due = DEFAULT_INVOICE_DAYS_UNTIL_DUE
-         stripe_invoice = stripe.Invoice.create(
-             auto_advance=True,
-             billing=billing_method,
-             customer=customer.stripe_customer_id,
-             days_until_due=days_until_due,
-             statement_descriptor='Zulip Standard')
-         stripe.Invoice.finalize_invoice(stripe_invoice)
+     stripe.InvoiceItem.create(
+         currency='usd',
+         customer=customer.stripe_customer_id,
+         description='Zulip Standard',
+         discountable=False,
+         period = {'start': datetime_to_timestamp(billing_cycle_anchor),
+                   'end': datetime_to_timestamp(period_end)},
+         quantity=billed_licenses,
+         unit_amount=price_per_license)
+     if charge_automatically:
+         billing_method = 'charge_automatically'
+         days_until_due = None
+     else:
+         billing_method = 'send_invoice'
+         days_until_due = DEFAULT_INVOICE_DAYS_UNTIL_DUE
+     stripe_invoice = stripe.Invoice.create(
+         auto_advance=True,
+         billing=billing_method,
+         customer=customer.stripe_customer_id,
+         days_until_due=days_until_due,
+         statement_descriptor='Zulip Standard')
+     stripe.Invoice.finalize_invoice(stripe_invoice)
from zerver.lib.actions import do_change_plan_type
do_change_plan_type(realm, Realm.STANDARD)
def update_license_ledger_for_automanaged_plan(realm: Realm, plan: CustomerPlan,
event_time: datetime) -> None:
-     new_plan, last_ledger_entry = make_end_of_cycle_updates_if_needed(plan, event_time)
+     last_ledger_entry = make_end_of_cycle_updates_if_needed(plan, event_time)
    if last_ledger_entry is None:
        return
-     if new_plan is not None:
-         plan = new_plan
licenses_at_next_renewal = get_latest_seat_count(realm)
licenses = max(licenses_at_next_renewal, last_ledger_entry.licenses)
LicenseLedger.objects.create(
plan=plan, event_time=event_time, licenses=licenses,
licenses_at_next_renewal=licenses_at_next_renewal)
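# Worked example (values illustrative) of the max() rule above: if the
# last ledger entry billed 25 licenses and the seat count has since
# dropped to 20, the plan keeps paying for 25 this cycle, while
# licenses_at_next_renewal records 20 for the upcoming renewal.
assert max(20, 25) == 25  # seat count shrank: billed licenses held at 25
assert max(30, 25) == 30  # seat count grew: billed licenses rise at once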
def update_license_ledger_if_needed(realm: Realm, event_time: datetime) -> None:
-     plan = get_current_plan_by_realm(realm)
+     customer = Customer.objects.filter(realm=realm).first()
+     if customer is None:
+         return
+     plan = get_current_plan(customer)
if plan is None:
return
if not plan.automanage_licenses:
@@ -480,19 +392,12 @@ def invoice_plan(plan: CustomerPlan, event_time: datetime) -> None:
if plan.invoicing_status == CustomerPlan.STARTED:
raise NotImplementedError('Plan with invoicing_status==STARTED needs manual resolution.')
make_end_of_cycle_updates_if_needed(plan, event_time)
-     if plan.invoicing_status == CustomerPlan.INITIAL_INVOICE_TO_BE_SENT:
-         invoiced_through_id = -1
-         licenses_base = None
-     else:
-         assert(plan.invoiced_through is not None)
-         licenses_base = plan.invoiced_through.licenses
-         invoiced_through_id = plan.invoiced_through.id
+     assert(plan.invoiced_through is not None)
+     licenses_base = plan.invoiced_through.licenses
    invoice_item_created = False
-     for ledger_entry in LicenseLedger.objects.filter(plan=plan, id__gt=invoiced_through_id,
+     for ledger_entry in LicenseLedger.objects.filter(plan=plan, id__gt=plan.invoiced_through.id,
                                                       event_time__lte=event_time).order_by('id'):
-         price_args: Dict[str, int] = {}
+         price_args = {}  # type: Dict[str, int]
if ledger_entry.is_renewal:
if plan.fixed_price is not None:
price_args = {'amount': plan.fixed_price}
@@ -501,7 +406,7 @@ def invoice_plan(plan: CustomerPlan, event_time: datetime) -> None:
price_args = {'unit_amount': plan.price_per_license,
'quantity': ledger_entry.licenses}
description = "Zulip Standard - renewal"
-         elif licenses_base is not None and ledger_entry.licenses != licenses_base:
+         elif ledger_entry.licenses != licenses_base:
assert(plan.price_per_license)
last_renewal = LicenseLedger.objects.filter(
plan=plan, is_renewal=True, event_time__lte=ledger_entry.event_time) \
@@ -517,6 +422,9 @@ def invoice_plan(plan: CustomerPlan, event_time: datetime) -> None:
plan.invoiced_through = ledger_entry
plan.invoicing_status = CustomerPlan.STARTED
plan.save(update_fields=['invoicing_status', 'invoiced_through'])
+         idempotency_key = 'ledger_entry:{}'.format(ledger_entry.id)  # type: Optional[str]
+         if settings.TEST_SUITE:
+             idempotency_key = None
stripe.InvoiceItem.create(
currency='usd',
customer=plan.customer.stripe_customer_id,
@@ -525,7 +433,7 @@ def invoice_plan(plan: CustomerPlan, event_time: datetime) -> None:
period = {'start': datetime_to_timestamp(ledger_entry.event_time),
'end': datetime_to_timestamp(
start_of_next_billing_cycle(plan, ledger_entry.event_time))},
-             idempotency_key=get_idempotency_key(ledger_entry),
+             idempotency_key=idempotency_key,
**price_args)
invoice_item_created = True
plan.invoiced_through = ledger_entry
@@ -558,13 +466,8 @@ def invoice_plans_as_needed(event_time: datetime=timezone_now()) -> None:
def attach_discount_to_realm(realm: Realm, discount: Decimal) -> None:
Customer.objects.update_or_create(realm=realm, defaults={'default_discount': discount})
- def update_sponsorship_status(realm: Realm, sponsorship_pending: bool) -> None:
-     customer, _ = Customer.objects.get_or_create(realm=realm)
-     customer.sponsorship_pending = sponsorship_pending
-     customer.save(update_fields=["sponsorship_pending"])
def get_discount_for_realm(realm: Realm) -> Optional[Decimal]:
-     customer = get_customer_by_realm(realm)
+     customer = Customer.objects.filter(realm=realm).first()
if customer is not None:
return customer.default_discount
return None
@@ -572,10 +475,8 @@ def get_discount_for_realm(realm: Realm) -> Optional[Decimal]:
def do_change_plan_status(plan: CustomerPlan, status: int) -> None:
plan.status = status
plan.save(update_fields=['status'])
-     billing_logger.info(
-         'Change plan status: Customer.id: %s, CustomerPlan.id: %s, status: %s',
-         plan.customer.id, plan.id, status,
-     )
+     billing_logger.info('Change plan status: Customer.id: %s, CustomerPlan.id: %s, status: %s' % (
+         plan.customer.id, plan.id, status))
def process_downgrade(plan: CustomerPlan) -> None:
from zerver.lib.actions import do_change_plan_type
@@ -595,16 +496,3 @@ def estimate_annual_recurring_revenue_by_realm() -> Dict[str, int]: # nocoverag
# TODO: Decimal stuff
annual_revenue[plan.customer.realm.string_id] = int(renewal_cents / 100)
return annual_revenue
- # During realm deactivation we instantly downgrade the plan to Limited.
- # Extra users added in the final month are not charged. Also used
- # for the cancellation of Free Trial.
- def downgrade_now(realm: Realm) -> None:
-     plan = get_current_plan_by_realm(realm)
-     if plan is None:
-         return
-     process_downgrade(plan)
-     plan.invoiced_through = LicenseLedger.objects.filter(plan=plan).order_by('id').last()
-     plan.next_invoice_date = next_invoice_date(plan)
-     plan.save(update_fields=["invoiced_through", "next_invoice_date"])

View File

@@ -1,7 +1,9 @@
+ # -*- coding: utf-8 -*-
# Generated by Django 1.11.14 on 2018-09-25 12:02
+ from __future__ import unicode_literals
+ import django.db.models.deletion
from django.db import migrations, models
- import django.db.models.deletion
class Migration(migrations.Migration):

View File

@@ -1,4 +1,6 @@
+ # -*- coding: utf-8 -*-
# Generated by Django 1.11.16 on 2018-12-12 20:19
+ from __future__ import unicode_literals
from django.db import migrations, models

View File

@@ -1,7 +1,9 @@
+ # -*- coding: utf-8 -*-
# Generated by Django 1.11.16 on 2018-12-22 21:05
+ from __future__ import unicode_literals
+ import django.db.models.deletion
from django.db import migrations, models
- import django.db.models.deletion
class Migration(migrations.Migration):

View File

@@ -1,7 +1,9 @@
+ # -*- coding: utf-8 -*-
# Generated by Django 1.11.18 on 2019-01-19 05:01
+ from __future__ import unicode_literals
+ import django.db.models.deletion
from django.db import migrations, models
- import django.db.models.deletion
class Migration(migrations.Migration):

View File

@@ -1,7 +1,9 @@
+ # -*- coding: utf-8 -*-
# Generated by Django 1.11.18 on 2019-01-28 13:04
+ from __future__ import unicode_literals
+ import django.db.models.deletion
from django.db import migrations, models
- import django.db.models.deletion
class Migration(migrations.Migration):

View File

@@ -1,4 +1,6 @@
+ # -*- coding: utf-8 -*-
# Generated by Django 1.11.18 on 2019-01-29 01:46
+ from __future__ import unicode_literals
from django.db import migrations, models

View File

@@ -1,4 +1,6 @@
+ # -*- coding: utf-8 -*-
# Generated by Django 1.11.18 on 2019-01-31 22:16
+ from __future__ import unicode_literals
from django.db import migrations

View File

@@ -1,4 +1,6 @@
+ # -*- coding: utf-8 -*-
# Generated by Django 1.11.20 on 2019-04-11 00:45
+ from __future__ import unicode_literals
from django.db import migrations, models

View File

@@ -1,18 +0,0 @@
# Generated by Django 2.2.13 on 2020-06-09 12:09
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('corporate', '0008_nullable_next_invoice_date'),
]
operations = [
migrations.AddField(
model_name='customer',
name='sponsorship_pending',
field=models.BooleanField(default=False),
),
]

View File

@@ -7,81 +7,67 @@ from django.db.models import CASCADE
from zerver.models import Realm
class Customer(models.Model):
-     realm: Realm = models.OneToOneField(Realm, on_delete=CASCADE)
-     stripe_customer_id: str = models.CharField(max_length=255, null=True, unique=True)
-     sponsorship_pending: bool = models.BooleanField(default=False)
+     realm = models.OneToOneField(Realm, on_delete=CASCADE)  # type: Realm
+     stripe_customer_id = models.CharField(max_length=255, null=True, unique=True)  # type: str
    # A percentage, like 85.
-     default_discount: Optional[Decimal] = models.DecimalField(decimal_places=4, max_digits=7, null=True)
+     default_discount = models.DecimalField(decimal_places=4, max_digits=7, null=True)  # type: Optional[Decimal]
    def __str__(self) -> str:
-         return f"<Customer {self.realm} {self.stripe_customer_id}>"
+         return "<Customer %s %s>" % (self.realm, self.stripe_customer_id)
- def get_customer_by_realm(realm: Realm) -> Optional[Customer]:
-     return Customer.objects.filter(realm=realm).first()
class CustomerPlan(models.Model):
-     customer: Customer = models.ForeignKey(Customer, on_delete=CASCADE)
-     automanage_licenses: bool = models.BooleanField(default=False)
-     charge_automatically: bool = models.BooleanField(default=False)
+     customer = models.ForeignKey(Customer, on_delete=CASCADE)  # type: Customer
+     automanage_licenses = models.BooleanField(default=False)  # type: bool
+     charge_automatically = models.BooleanField(default=False)  # type: bool
# Both of these are in cents. Exactly one of price_per_license or
# fixed_price should be set. fixed_price is only for manual deals, and
# can't be set via the self-serve billing system.
-     price_per_license: Optional[int] = models.IntegerField(null=True)
-     fixed_price: Optional[int] = models.IntegerField(null=True)
+     price_per_license = models.IntegerField(null=True)  # type: Optional[int]
+     fixed_price = models.IntegerField(null=True)  # type: Optional[int]
# Discount that was applied. For display purposes only.
-     discount: Optional[Decimal] = models.DecimalField(decimal_places=4, max_digits=6, null=True)
+     discount = models.DecimalField(decimal_places=4, max_digits=6, null=True)  # type: Optional[Decimal]
-     billing_cycle_anchor: datetime.datetime = models.DateTimeField()
+     billing_cycle_anchor = models.DateTimeField()  # type: datetime.datetime
ANNUAL = 1
MONTHLY = 2
-     billing_schedule: int = models.SmallIntegerField()
+     billing_schedule = models.SmallIntegerField()  # type: int
-     next_invoice_date: Optional[datetime.datetime] = models.DateTimeField(db_index=True, null=True)
+     next_invoice_date = models.DateTimeField(db_index=True, null=True)  # type: Optional[datetime.datetime]
-     invoiced_through: Optional["LicenseLedger"] = models.ForeignKey(
-         'LicenseLedger', null=True, on_delete=CASCADE, related_name='+')
+     invoiced_through = models.ForeignKey(
+         'LicenseLedger', null=True, on_delete=CASCADE, related_name='+')  # type: Optional[LicenseLedger]
DONE = 1
STARTED = 2
INITIAL_INVOICE_TO_BE_SENT = 3
-     invoicing_status: int = models.SmallIntegerField(default=DONE)
+     invoicing_status = models.SmallIntegerField(default=DONE)  # type: int
STANDARD = 1
PLUS = 2 # not available through self-serve signup
ENTERPRISE = 10
-     tier: int = models.SmallIntegerField()
+     tier = models.SmallIntegerField()  # type: int
ACTIVE = 1
DOWNGRADE_AT_END_OF_CYCLE = 2
FREE_TRIAL = 3
SWITCH_TO_ANNUAL_AT_END_OF_CYCLE = 4
# "Live" plans should have a value < LIVE_STATUS_THRESHOLD.
# There should be at most one live plan per customer.
LIVE_STATUS_THRESHOLD = 10
ENDED = 11
NEVER_STARTED = 12
-     status: int = models.SmallIntegerField(default=ACTIVE)
+     status = models.SmallIntegerField(default=ACTIVE)  # type: int
# TODO maybe override setattr to ensure billing_cycle_anchor, etc are immutable
- def get_current_plan_by_customer(customer: Customer) -> Optional[CustomerPlan]:
+ def get_current_plan(customer: Customer) -> Optional[CustomerPlan]:
return CustomerPlan.objects.filter(
customer=customer, status__lt=CustomerPlan.LIVE_STATUS_THRESHOLD).first()
- def get_current_plan_by_realm(realm: Realm) -> Optional[CustomerPlan]:
-     customer = get_customer_by_realm(realm)
-     if customer is None:
-         return None
-     return get_current_plan_by_customer(customer)
class LicenseLedger(models.Model):
-     plan: CustomerPlan = models.ForeignKey(CustomerPlan, on_delete=CASCADE)
+     plan = models.ForeignKey(CustomerPlan, on_delete=CASCADE)  # type: CustomerPlan
# Also True for the initial upgrade.
-     is_renewal: bool = models.BooleanField(default=False)
-     event_time: datetime.datetime = models.DateTimeField()
-     licenses: int = models.IntegerField()
+     is_renewal = models.BooleanField(default=False)  # type: bool
+     event_time = models.DateTimeField()  # type: datetime.datetime
+     licenses = models.IntegerField()  # type: int
# None means the plan does not automatically renew.
# This cannot be None if plan.automanage_licenses.
-     licenses_at_next_renewal: Optional[int] = models.IntegerField(null=True)
+     licenses_at_next_renewal = models.IntegerField(null=True)  # type: Optional[int]
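# Illustrative checks (not code from this diff) for two invariants noted
# in the comments above: exactly one of price_per_license / fixed_price
# is set, and "live" plans are those with status below
# LIVE_STATUS_THRESHOLD (at most one per customer).
def price_fields_valid(plan: CustomerPlan) -> bool:
    return (plan.price_per_license is None) != (plan.fixed_price is None)

def has_live_plan(customer: Customer) -> bool:
    return CustomerPlan.objects.filter(
        customer=customer, status__lt=CustomerPlan.LIVE_STATUS_THRESHOLD).exists()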

View File

@@ -1,6 +0,0 @@
{
"data": [],
"has_more": false,
"object": "list",
"url": "/v1/charges"
}

View File

@@ -1,104 +0,0 @@
{
"account_balance": 0,
"address": null,
"balance": 0,
"created": 1010000001,
"currency": null,
"default_source": {
"address_city": "Pacific",
"address_country": "United States",
"address_line1": "Under the sea,",
"address_line1_check": "pass",
"address_line2": null,
"address_state": null,
"address_zip": "33333",
"address_zip_check": "pass",
"brand": "Visa",
"country": "US",
"customer": "cus_NORMALIZED0001",
"cvc_check": "pass",
"dynamic_last4": null,
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"id": "card_NORMALIZED00000000000001",
"last4": "4242",
"metadata": {},
"name": "Ada Starr",
"object": "card",
"tokenization_method": null
},
"delinquent": false,
"description": "zulip (Zulip Dev)",
"discount": null,
"email": "hamlet@zulip.com",
"id": "cus_NORMALIZED0001",
"invoice_prefix": "NORMA01",
"invoice_settings": {
"custom_fields": null,
"default_payment_method": null,
"footer": null
},
"livemode": false,
"metadata": {
"realm_id": "1",
"realm_str": "zulip"
},
"name": null,
"next_invoice_sequence": 1,
"object": "customer",
"phone": null,
"preferred_locales": [],
"shipping": null,
"sources": {
"data": [
{
"address_city": "Pacific",
"address_country": "United States",
"address_line1": "Under the sea,",
"address_line1_check": "pass",
"address_line2": null,
"address_state": null,
"address_zip": "33333",
"address_zip_check": "pass",
"brand": "Visa",
"country": "US",
"customer": "cus_NORMALIZED0001",
"cvc_check": "pass",
"dynamic_last4": null,
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"id": "card_NORMALIZED00000000000001",
"last4": "4242",
"metadata": {},
"name": "Ada Starr",
"object": "card",
"tokenization_method": null
}
],
"has_more": false,
"object": "list",
"total_count": 1,
"url": "/v1/customers/cus_NORMALIZED0001/sources"
},
"subscriptions": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/customers/cus_NORMALIZED0001/subscriptions"
},
"tax_exempt": "none",
"tax_ids": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/customers/cus_NORMALIZED0001/tax_ids"
},
"tax_info": null,
"tax_info_verification": null
}

View File

@@ -1,104 +0,0 @@
{
"account_balance": 0,
"address": null,
"balance": 0,
"created": 1010000001,
"currency": null,
"default_source": {
"address_city": "Pacific",
"address_country": "United States",
"address_line1": "Under the sea,",
"address_line1_check": "pass",
"address_line2": null,
"address_state": null,
"address_zip": "33333",
"address_zip_check": "pass",
"brand": "Visa",
"country": "US",
"customer": "cus_NORMALIZED0001",
"cvc_check": "pass",
"dynamic_last4": null,
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"id": "card_NORMALIZED00000000000001",
"last4": "4242",
"metadata": {},
"name": "Ada Starr",
"object": "card",
"tokenization_method": null
},
"delinquent": false,
"description": "zulip (Zulip Dev)",
"discount": null,
"email": "hamlet@zulip.com",
"id": "cus_NORMALIZED0001",
"invoice_prefix": "NORMA01",
"invoice_settings": {
"custom_fields": null,
"default_payment_method": null,
"footer": null
},
"livemode": false,
"metadata": {
"realm_id": "1",
"realm_str": "zulip"
},
"name": null,
"next_invoice_sequence": 1,
"object": "customer",
"phone": null,
"preferred_locales": [],
"shipping": null,
"sources": {
"data": [
{
"address_city": "Pacific",
"address_country": "United States",
"address_line1": "Under the sea,",
"address_line1_check": "pass",
"address_line2": null,
"address_state": null,
"address_zip": "33333",
"address_zip_check": "pass",
"brand": "Visa",
"country": "US",
"customer": "cus_NORMALIZED0001",
"cvc_check": "pass",
"dynamic_last4": null,
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"id": "card_NORMALIZED00000000000001",
"last4": "4242",
"metadata": {},
"name": "Ada Starr",
"object": "card",
"tokenization_method": null
}
],
"has_more": false,
"object": "list",
"total_count": 1,
"url": "/v1/customers/cus_NORMALIZED0001/sources"
},
"subscriptions": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/customers/cus_NORMALIZED0001/subscriptions"
},
"tax_exempt": "none",
"tax_ids": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/customers/cus_NORMALIZED0001/tax_ids"
},
"tax_info": null,
"tax_info_verification": null
}

View File

@@ -1,104 +0,0 @@
{
"account_balance": 0,
"address": null,
"balance": 0,
"created": 1010000001,
"currency": null,
"default_source": {
"address_city": "Pacific",
"address_country": "United States",
"address_line1": "Under the sea,",
"address_line1_check": "pass",
"address_line2": null,
"address_state": null,
"address_zip": "33333",
"address_zip_check": "pass",
"brand": "Visa",
"country": "US",
"customer": "cus_NORMALIZED0001",
"cvc_check": "pass",
"dynamic_last4": null,
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"id": "card_NORMALIZED00000000000001",
"last4": "4242",
"metadata": {},
"name": "Ada Starr",
"object": "card",
"tokenization_method": null
},
"delinquent": false,
"description": "zulip (Zulip Dev)",
"discount": null,
"email": "hamlet@zulip.com",
"id": "cus_NORMALIZED0001",
"invoice_prefix": "NORMA01",
"invoice_settings": {
"custom_fields": null,
"default_payment_method": null,
"footer": null
},
"livemode": false,
"metadata": {
"realm_id": "1",
"realm_str": "zulip"
},
"name": null,
"next_invoice_sequence": 1,
"object": "customer",
"phone": null,
"preferred_locales": [],
"shipping": null,
"sources": {
"data": [
{
"address_city": "Pacific",
"address_country": "United States",
"address_line1": "Under the sea,",
"address_line1_check": "pass",
"address_line2": null,
"address_state": null,
"address_zip": "33333",
"address_zip_check": "pass",
"brand": "Visa",
"country": "US",
"customer": "cus_NORMALIZED0001",
"cvc_check": "pass",
"dynamic_last4": null,
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"id": "card_NORMALIZED00000000000001",
"last4": "4242",
"metadata": {},
"name": "Ada Starr",
"object": "card",
"tokenization_method": null
}
],
"has_more": false,
"object": "list",
"total_count": 1,
"url": "/v1/customers/cus_NORMALIZED0001/sources"
},
"subscriptions": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/customers/cus_NORMALIZED0001/subscriptions"
},
"tax_exempt": "none",
"tax_ids": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/customers/cus_NORMALIZED0001/tax_ids"
},
"tax_info": null,
"tax_info_verification": null
}

View File

@@ -1,96 +0,0 @@
{
"account_country": "US",
"account_name": "Dev account",
"amount_due": 120000,
"amount_paid": 0,
"amount_remaining": 120000,
"application_fee_amount": null,
"attempt_count": 0,
"attempted": false,
"auto_advance": true,
"billing": "charge_automatically",
"billing_reason": "manual",
"charge": null,
"collection_method": "charge_automatically",
"created": 1010000002,
"currency": "usd",
"custom_fields": null,
"customer": "cus_NORMALIZED0001",
"customer_address": null,
"customer_email": "hamlet@zulip.com",
"customer_name": null,
"customer_phone": null,
"customer_shipping": null,
"customer_tax_exempt": "none",
"customer_tax_ids": [],
"default_payment_method": null,
"default_source": null,
"default_tax_rates": [],
"description": "",
"discount": null,
"due_date": null,
"ending_balance": null,
"footer": null,
"hosted_invoice_url": null,
"id": "in_NORMALIZED00000000000001",
"invoice_pdf": null,
"lines": {
"data": [
{
"amount": 120000,
"currency": "usd",
"description": "Zulip Standard - renewal",
"discountable": false,
"id": "ii_NORMALIZED00000000000001",
"invoice_item": "ii_NORMALIZED00000000000001",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1362193445,
"start": 1330657445
},
"plan": null,
"proration": false,
"quantity": 15,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem",
"unique_id": "il_1Gb9aAGh0CmXqmnwYD2vuFL3"
}
],
"has_more": false,
"object": "list",
"total_count": 1,
"url": "/v1/invoices/in_NORMALIZED00000000000001/lines"
},
"livemode": false,
"metadata": {},
"next_payment_attempt": 1000000000,
"number": "NORMALI-0001",
"object": "invoice",
"paid": false,
"payment_intent": null,
"period_end": 1000000000,
"period_start": 1000000000,
"post_payment_credit_notes_amount": 0,
"pre_payment_credit_notes_amount": 0,
"receipt_number": null,
"starting_balance": 0,
"statement_descriptor": "Zulip Standard",
"status": "draft",
"status_transitions": {
"finalized_at": null,
"marked_uncollectible_at": null,
"paid_at": null,
"voided_at": null
},
"subscription": null,
"subtotal": 120000,
"tax": null,
"tax_percent": null,
"total": 120000,
"total_tax_amounts": [],
"webhooks_delivered_at": 1000000000
}

View File

@@ -1,96 +0,0 @@
{
"account_country": "US",
"account_name": "Dev account",
"amount_due": 5172,
"amount_paid": 0,
"amount_remaining": 5172,
"application_fee_amount": null,
"attempt_count": 0,
"attempted": false,
"auto_advance": true,
"billing": "charge_automatically",
"billing_reason": "manual",
"charge": null,
"collection_method": "charge_automatically",
"created": 1010000003,
"currency": "usd",
"custom_fields": null,
"customer": "cus_NORMALIZED0001",
"customer_address": null,
"customer_email": "hamlet@zulip.com",
"customer_name": null,
"customer_phone": null,
"customer_shipping": null,
"customer_tax_exempt": "none",
"customer_tax_ids": [],
"default_payment_method": null,
"default_source": null,
"default_tax_rates": [],
"description": "",
"discount": null,
"due_date": null,
"ending_balance": null,
"footer": null,
"hosted_invoice_url": null,
"id": "in_NORMALIZED00000000000002",
"invoice_pdf": null,
"lines": {
"data": [
{
"amount": 5172,
"currency": "usd",
"description": "Additional license (Jan 2, 2013 - Mar 2, 2013)",
"discountable": false,
"id": "ii_NORMALIZED00000000000002",
"invoice_item": "ii_NORMALIZED00000000000002",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1362193445,
"start": 1357095845
},
"plan": null,
"proration": false,
"quantity": 4,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem",
"unique_id": "il_1Gb9aCGh0CmXqmnwp7mzzDq1"
}
],
"has_more": false,
"object": "list",
"total_count": 1,
"url": "/v1/invoices/in_NORMALIZED00000000000002/lines"
},
"livemode": false,
"metadata": {},
"next_payment_attempt": 1000000000,
"number": "NORMALI-0002",
"object": "invoice",
"paid": false,
"payment_intent": null,
"period_end": 1000000000,
"period_start": 1000000000,
"post_payment_credit_notes_amount": 0,
"pre_payment_credit_notes_amount": 0,
"receipt_number": null,
"starting_balance": 0,
"statement_descriptor": "Zulip Standard",
"status": "draft",
"status_transitions": {
"finalized_at": null,
"marked_uncollectible_at": null,
"paid_at": null,
"voided_at": null
},
"subscription": null,
"subtotal": 5172,
"tax": null,
"tax_percent": null,
"total": 5172,
"total_tax_amounts": [],
"webhooks_delivered_at": 1000000000
}

View File

@@ -1,96 +0,0 @@
{
"account_country": "US",
"account_name": "Dev account",
"amount_due": 152000,
"amount_paid": 0,
"amount_remaining": 152000,
"application_fee_amount": null,
"attempt_count": 0,
"attempted": false,
"auto_advance": true,
"billing": "charge_automatically",
"billing_reason": "manual",
"charge": null,
"collection_method": "charge_automatically",
"created": 1010000004,
"currency": "usd",
"custom_fields": null,
"customer": "cus_NORMALIZED0001",
"customer_address": null,
"customer_email": "hamlet@zulip.com",
"customer_name": null,
"customer_phone": null,
"customer_shipping": null,
"customer_tax_exempt": "none",
"customer_tax_ids": [],
"default_payment_method": null,
"default_source": null,
"default_tax_rates": [],
"description": "",
"discount": null,
"due_date": null,
"ending_balance": null,
"footer": null,
"hosted_invoice_url": null,
"id": "in_NORMALIZED00000000000003",
"invoice_pdf": null,
"lines": {
"data": [
{
"amount": 152000,
"currency": "usd",
"description": "Zulip Standard - renewal",
"discountable": false,
"id": "ii_NORMALIZED00000000000003",
"invoice_item": "ii_NORMALIZED00000000000003",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1393729445,
"start": 1362193445
},
"plan": null,
"proration": false,
"quantity": 19,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem",
"unique_id": "il_1Gb9aEGh0CmXqmnwbJpsbILw"
}
],
"has_more": false,
"object": "list",
"total_count": 1,
"url": "/v1/invoices/in_NORMALIZED00000000000003/lines"
},
"livemode": false,
"metadata": {},
"next_payment_attempt": 1000000000,
"number": "NORMALI-0003",
"object": "invoice",
"paid": false,
"payment_intent": null,
"period_end": 1000000000,
"period_start": 1000000000,
"post_payment_credit_notes_amount": 0,
"pre_payment_credit_notes_amount": 0,
"receipt_number": null,
"starting_balance": 0,
"statement_descriptor": "Zulip Standard",
"status": "draft",
"status_transitions": {
"finalized_at": null,
"marked_uncollectible_at": null,
"paid_at": null,
"voided_at": null
},
"subscription": null,
"subtotal": 152000,
"tax": null,
"tax_percent": null,
"total": 152000,
"total_tax_amounts": [],
"webhooks_delivered_at": 1000000000
}

View File

@@ -1,96 +0,0 @@
{
"account_country": "US",
"account_name": "Dev account",
"amount_due": 120000,
"amount_paid": 0,
"amount_remaining": 120000,
"application_fee_amount": null,
"attempt_count": 0,
"attempted": false,
"auto_advance": true,
"billing": "charge_automatically",
"billing_reason": "manual",
"charge": null,
"collection_method": "charge_automatically",
"created": 1010000002,
"currency": "usd",
"custom_fields": null,
"customer": "cus_NORMALIZED0001",
"customer_address": null,
"customer_email": "hamlet@zulip.com",
"customer_name": null,
"customer_phone": null,
"customer_shipping": null,
"customer_tax_exempt": "none",
"customer_tax_ids": [],
"default_payment_method": null,
"default_source": null,
"default_tax_rates": [],
"description": "",
"discount": null,
"due_date": 1000000000,
"ending_balance": 0,
"footer": null,
"hosted_invoice_url": "https://pay.stripe.com/invoice/acct_NORMALIZED000001/invst_NORMALIZED0000000000000001PVAbb",
"id": "in_NORMALIZED00000000000001",
"invoice_pdf": "https://pay.stripe.com/invoice/acct_NORMALIZED000001/invst_NORMALIZED0000000000000001PVAbb/pdf",
"lines": {
"data": [
{
"amount": 120000,
"currency": "usd",
"description": "Zulip Standard - renewal",
"discountable": false,
"id": "ii_NORMALIZED00000000000001",
"invoice_item": "ii_NORMALIZED00000000000001",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1362193445,
"start": 1330657445
},
"plan": null,
"proration": false,
"quantity": 15,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem",
"unique_id": "il_1Gb9aAGh0CmXqmnwYD2vuFL3"
}
],
"has_more": false,
"object": "list",
"total_count": 1,
"url": "/v1/invoices/in_NORMALIZED00000000000001/lines"
},
"livemode": false,
"metadata": {},
"next_payment_attempt": 1000000000,
"number": "NORMALI-0001",
"object": "invoice",
"paid": false,
"payment_intent": "pi_1Gb9aBGh0CmXqmnw3RdjXFtK",
"period_end": 1000000000,
"period_start": 1000000000,
"post_payment_credit_notes_amount": 0,
"pre_payment_credit_notes_amount": 0,
"receipt_number": null,
"starting_balance": 0,
"statement_descriptor": "Zulip Standard",
"status": "open",
"status_transitions": {
"finalized_at": 1000000000,
"marked_uncollectible_at": null,
"paid_at": null,
"voided_at": null
},
"subscription": null,
"subtotal": 120000,
"tax": null,
"tax_percent": null,
"total": 120000,
"total_tax_amounts": [],
"webhooks_delivered_at": 1000000000
}

View File

@@ -1,96 +0,0 @@
{
"account_country": "US",
"account_name": "Dev account",
"amount_due": 5172,
"amount_paid": 0,
"amount_remaining": 5172,
"application_fee_amount": null,
"attempt_count": 0,
"attempted": false,
"auto_advance": true,
"billing": "charge_automatically",
"billing_reason": "manual",
"charge": null,
"collection_method": "charge_automatically",
"created": 1010000003,
"currency": "usd",
"custom_fields": null,
"customer": "cus_NORMALIZED0001",
"customer_address": null,
"customer_email": "hamlet@zulip.com",
"customer_name": null,
"customer_phone": null,
"customer_shipping": null,
"customer_tax_exempt": "none",
"customer_tax_ids": [],
"default_payment_method": null,
"default_source": null,
"default_tax_rates": [],
"description": "",
"discount": null,
"due_date": 1000000000,
"ending_balance": 0,
"footer": null,
"hosted_invoice_url": "https://pay.stripe.com/invoice/acct_NORMALIZED000001/invst_NORMALIZED00000000000000027eQ4i",
"id": "in_NORMALIZED00000000000002",
"invoice_pdf": "https://pay.stripe.com/invoice/acct_NORMALIZED000001/invst_NORMALIZED00000000000000027eQ4i/pdf",
"lines": {
"data": [
{
"amount": 5172,
"currency": "usd",
"description": "Additional license (Jan 2, 2013 - Mar 2, 2013)",
"discountable": false,
"id": "ii_NORMALIZED00000000000002",
"invoice_item": "ii_NORMALIZED00000000000002",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1362193445,
"start": 1357095845
},
"plan": null,
"proration": false,
"quantity": 4,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem",
"unique_id": "il_1Gb9aCGh0CmXqmnwp7mzzDq1"
}
],
"has_more": false,
"object": "list",
"total_count": 1,
"url": "/v1/invoices/in_NORMALIZED00000000000002/lines"
},
"livemode": false,
"metadata": {},
"next_payment_attempt": 1000000000,
"number": "NORMALI-0002",
"object": "invoice",
"paid": false,
"payment_intent": "pi_1Gb9aDGh0CmXqmnwTdKviFVy",
"period_end": 1000000000,
"period_start": 1000000000,
"post_payment_credit_notes_amount": 0,
"pre_payment_credit_notes_amount": 0,
"receipt_number": null,
"starting_balance": 0,
"statement_descriptor": "Zulip Standard",
"status": "open",
"status_transitions": {
"finalized_at": 1000000000,
"marked_uncollectible_at": null,
"paid_at": null,
"voided_at": null
},
"subscription": null,
"subtotal": 5172,
"tax": null,
"tax_percent": null,
"total": 5172,
"total_tax_amounts": [],
"webhooks_delivered_at": 1000000000
}

View File

@@ -1,96 +0,0 @@
{
"account_country": "US",
"account_name": "Dev account",
"amount_due": 152000,
"amount_paid": 0,
"amount_remaining": 152000,
"application_fee_amount": null,
"attempt_count": 0,
"attempted": false,
"auto_advance": true,
"billing": "charge_automatically",
"billing_reason": "manual",
"charge": null,
"collection_method": "charge_automatically",
"created": 1010000004,
"currency": "usd",
"custom_fields": null,
"customer": "cus_NORMALIZED0001",
"customer_address": null,
"customer_email": "hamlet@zulip.com",
"customer_name": null,
"customer_phone": null,
"customer_shipping": null,
"customer_tax_exempt": "none",
"customer_tax_ids": [],
"default_payment_method": null,
"default_source": null,
"default_tax_rates": [],
"description": "",
"discount": null,
"due_date": 1000000000,
"ending_balance": 0,
"footer": null,
"hosted_invoice_url": "https://pay.stripe.com/invoice/acct_NORMALIZED000001/invst_NORMALIZED0000000000000003u5uCV",
"id": "in_NORMALIZED00000000000003",
"invoice_pdf": "https://pay.stripe.com/invoice/acct_NORMALIZED000001/invst_NORMALIZED0000000000000003u5uCV/pdf",
"lines": {
"data": [
{
"amount": 152000,
"currency": "usd",
"description": "Zulip Standard - renewal",
"discountable": false,
"id": "ii_NORMALIZED00000000000003",
"invoice_item": "ii_NORMALIZED00000000000003",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1393729445,
"start": 1362193445
},
"plan": null,
"proration": false,
"quantity": 19,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem",
"unique_id": "il_1Gb9aEGh0CmXqmnwbJpsbILw"
}
],
"has_more": false,
"object": "list",
"total_count": 1,
"url": "/v1/invoices/in_NORMALIZED00000000000003/lines"
},
"livemode": false,
"metadata": {},
"next_payment_attempt": 1000000000,
"number": "NORMALI-0003",
"object": "invoice",
"paid": false,
"payment_intent": "pi_1Gb9aGGh0CmXqmnwehKKVEuG",
"period_end": 1000000000,
"period_start": 1000000000,
"post_payment_credit_notes_amount": 0,
"pre_payment_credit_notes_amount": 0,
"receipt_number": null,
"starting_balance": 0,
"statement_descriptor": "Zulip Standard",
"status": "open",
"status_transitions": {
"finalized_at": 1000000000,
"marked_uncollectible_at": null,
"paid_at": null,
"voided_at": null
},
"subscription": null,
"subtotal": 152000,
"tax": null,
"tax_percent": null,
"total": 152000,
"total_tax_amounts": [],
"webhooks_delivered_at": 1000000000
}

View File

@@ -1,6 +0,0 @@
{
"data": [],
"has_more": false,
"object": "list",
"url": "/v1/invoices"
}

View File

@@ -1,6 +0,0 @@
{
"data": [],
"has_more": false,
"object": "list",
"url": "/v1/invoices"
}

View File

@@ -1,103 +0,0 @@
{
"data": [
{
"account_country": "US",
"account_name": "Dev account",
"amount_due": 120000,
"amount_paid": 0,
"amount_remaining": 120000,
"application_fee_amount": null,
"attempt_count": 0,
"attempted": false,
"auto_advance": true,
"billing": "charge_automatically",
"billing_reason": "manual",
"charge": null,
"collection_method": "charge_automatically",
"created": 1010000002,
"currency": "usd",
"custom_fields": null,
"customer": "cus_NORMALIZED0001",
"customer_address": null,
"customer_email": "hamlet@zulip.com",
"customer_name": null,
"customer_phone": null,
"customer_shipping": null,
"customer_tax_exempt": "none",
"customer_tax_ids": [],
"default_payment_method": null,
"default_source": null,
"default_tax_rates": [],
"description": "",
"discount": null,
"due_date": 1000000000,
"ending_balance": 0,
"footer": null,
"hosted_invoice_url": "https://pay.stripe.com/invoice/acct_NORMALIZED000001/invst_NORMALIZED0000000000000001PVAbb",
"id": "in_NORMALIZED00000000000001",
"invoice_pdf": "https://pay.stripe.com/invoice/acct_NORMALIZED000001/invst_NORMALIZED0000000000000001PVAbb/pdf",
"lines": {
"data": [
{
"amount": 120000,
"currency": "usd",
"description": "Zulip Standard - renewal",
"discountable": false,
"id": "ii_NORMALIZED00000000000001",
"invoice_item": "ii_NORMALIZED00000000000001",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1362193445,
"start": 1330657445
},
"plan": null,
"proration": false,
"quantity": 15,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem",
"unique_id": "il_1Gb9aAGh0CmXqmnwYD2vuFL3"
}
],
"has_more": false,
"object": "list",
"total_count": 1,
"url": "/v1/invoices/in_NORMALIZED00000000000001/lines"
},
"livemode": false,
"metadata": {},
"next_payment_attempt": 1000000000,
"number": "NORMALI-0001",
"object": "invoice",
"paid": false,
"payment_intent": "pi_1Gb9aBGh0CmXqmnw3RdjXFtK",
"period_end": 1000000000,
"period_start": 1000000000,
"post_payment_credit_notes_amount": 0,
"pre_payment_credit_notes_amount": 0,
"receipt_number": null,
"starting_balance": 0,
"statement_descriptor": "Zulip Standard",
"status": "open",
"status_transitions": {
"finalized_at": 1000000000,
"marked_uncollectible_at": null,
"paid_at": null,
"voided_at": null
},
"subscription": null,
"subtotal": 120000,
"tax": null,
"tax_percent": null,
"total": 120000,
"total_tax_amounts": [],
"webhooks_delivered_at": 1000000000
}
],
"has_more": false,
"object": "list",
"url": "/v1/invoices"
}

View File

@@ -1,103 +0,0 @@
{
"data": [
{
"account_country": "US",
"account_name": "Dev account",
"amount_due": 120000,
"amount_paid": 0,
"amount_remaining": 120000,
"application_fee_amount": null,
"attempt_count": 0,
"attempted": false,
"auto_advance": true,
"billing": "charge_automatically",
"billing_reason": "manual",
"charge": null,
"collection_method": "charge_automatically",
"created": 1010000002,
"currency": "usd",
"custom_fields": null,
"customer": "cus_NORMALIZED0001",
"customer_address": null,
"customer_email": "hamlet@zulip.com",
"customer_name": null,
"customer_phone": null,
"customer_shipping": null,
"customer_tax_exempt": "none",
"customer_tax_ids": [],
"default_payment_method": null,
"default_source": null,
"default_tax_rates": [],
"description": "",
"discount": null,
"due_date": 1000000000,
"ending_balance": 0,
"footer": null,
"hosted_invoice_url": "https://pay.stripe.com/invoice/acct_NORMALIZED000001/invst_NORMALIZED0000000000000001PVAbb",
"id": "in_NORMALIZED00000000000001",
"invoice_pdf": "https://pay.stripe.com/invoice/acct_NORMALIZED000001/invst_NORMALIZED0000000000000001PVAbb/pdf",
"lines": {
"data": [
{
"amount": 120000,
"currency": "usd",
"description": "Zulip Standard - renewal",
"discountable": false,
"id": "ii_NORMALIZED00000000000001",
"invoice_item": "ii_NORMALIZED00000000000001",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1362193445,
"start": 1330657445
},
"plan": null,
"proration": false,
"quantity": 15,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem",
"unique_id": "il_1Gb9aAGh0CmXqmnwYD2vuFL3"
}
],
"has_more": false,
"object": "list",
"total_count": 1,
"url": "/v1/invoices/in_NORMALIZED00000000000001/lines"
},
"livemode": false,
"metadata": {},
"next_payment_attempt": 1000000000,
"number": "NORMALI-0001",
"object": "invoice",
"paid": false,
"payment_intent": "pi_1Gb9aBGh0CmXqmnw3RdjXFtK",
"period_end": 1000000000,
"period_start": 1000000000,
"post_payment_credit_notes_amount": 0,
"pre_payment_credit_notes_amount": 0,
"receipt_number": null,
"starting_balance": 0,
"statement_descriptor": "Zulip Standard",
"status": "open",
"status_transitions": {
"finalized_at": 1000000000,
"marked_uncollectible_at": null,
"paid_at": null,
"voided_at": null
},
"subscription": null,
"subtotal": 120000,
"tax": null,
"tax_percent": null,
"total": 120000,
"total_tax_amounts": [],
"webhooks_delivered_at": 1000000000
}
],
"has_more": false,
"object": "list",
"url": "/v1/invoices"
}

View File

@@ -1,199 +0,0 @@
{
"data": [
{
"account_country": "US",
"account_name": "Dev account",
"amount_due": 5172,
"amount_paid": 0,
"amount_remaining": 5172,
"application_fee_amount": null,
"attempt_count": 0,
"attempted": false,
"auto_advance": true,
"billing": "charge_automatically",
"billing_reason": "manual",
"charge": null,
"collection_method": "charge_automatically",
"created": 1010000003,
"currency": "usd",
"custom_fields": null,
"customer": "cus_NORMALIZED0001",
"customer_address": null,
"customer_email": "hamlet@zulip.com",
"customer_name": null,
"customer_phone": null,
"customer_shipping": null,
"customer_tax_exempt": "none",
"customer_tax_ids": [],
"default_payment_method": null,
"default_source": null,
"default_tax_rates": [],
"description": "",
"discount": null,
"due_date": 1000000000,
"ending_balance": 0,
"footer": null,
"hosted_invoice_url": "https://pay.stripe.com/invoice/acct_NORMALIZED000001/invst_NORMALIZED00000000000000027eQ4i",
"id": "in_NORMALIZED00000000000002",
"invoice_pdf": "https://pay.stripe.com/invoice/acct_NORMALIZED000001/invst_NORMALIZED00000000000000027eQ4i/pdf",
"lines": {
"data": [
{
"amount": 5172,
"currency": "usd",
"description": "Additional license (Jan 2, 2013 - Mar 2, 2013)",
"discountable": false,
"id": "ii_NORMALIZED00000000000002",
"invoice_item": "ii_NORMALIZED00000000000002",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1362193445,
"start": 1357095845
},
"plan": null,
"proration": false,
"quantity": 4,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem",
"unique_id": "il_1Gb9aCGh0CmXqmnwp7mzzDq1"
}
],
"has_more": false,
"object": "list",
"total_count": 1,
"url": "/v1/invoices/in_NORMALIZED00000000000002/lines"
},
"livemode": false,
"metadata": {},
"next_payment_attempt": 1000000000,
"number": "NORMALI-0002",
"object": "invoice",
"paid": false,
"payment_intent": "pi_1Gb9aDGh0CmXqmnwTdKviFVy",
"period_end": 1000000000,
"period_start": 1000000000,
"post_payment_credit_notes_amount": 0,
"pre_payment_credit_notes_amount": 0,
"receipt_number": null,
"starting_balance": 0,
"statement_descriptor": "Zulip Standard",
"status": "open",
"status_transitions": {
"finalized_at": 1000000000,
"marked_uncollectible_at": null,
"paid_at": null,
"voided_at": null
},
"subscription": null,
"subtotal": 5172,
"tax": null,
"tax_percent": null,
"total": 5172,
"total_tax_amounts": [],
"webhooks_delivered_at": 1000000000
},
{
"account_country": "US",
"account_name": "Dev account",
"amount_due": 120000,
"amount_paid": 0,
"amount_remaining": 120000,
"application_fee_amount": null,
"attempt_count": 0,
"attempted": false,
"auto_advance": true,
"billing": "charge_automatically",
"billing_reason": "manual",
"charge": null,
"collection_method": "charge_automatically",
"created": 1010000002,
"currency": "usd",
"custom_fields": null,
"customer": "cus_NORMALIZED0001",
"customer_address": null,
"customer_email": "hamlet@zulip.com",
"customer_name": null,
"customer_phone": null,
"customer_shipping": null,
"customer_tax_exempt": "none",
"customer_tax_ids": [],
"default_payment_method": null,
"default_source": null,
"default_tax_rates": [],
"description": "",
"discount": null,
"due_date": 1000000000,
"ending_balance": 0,
"footer": null,
"hosted_invoice_url": "https://pay.stripe.com/invoice/acct_NORMALIZED000001/invst_NORMALIZED0000000000000001PVAbb",
"id": "in_NORMALIZED00000000000001",
"invoice_pdf": "https://pay.stripe.com/invoice/acct_NORMALIZED000001/invst_NORMALIZED0000000000000001PVAbb/pdf",
"lines": {
"data": [
{
"amount": 120000,
"currency": "usd",
"description": "Zulip Standard - renewal",
"discountable": false,
"id": "ii_NORMALIZED00000000000001",
"invoice_item": "ii_NORMALIZED00000000000001",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1362193445,
"start": 1330657445
},
"plan": null,
"proration": false,
"quantity": 15,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem",
"unique_id": "il_1Gb9aAGh0CmXqmnwYD2vuFL3"
}
],
"has_more": false,
"object": "list",
"total_count": 1,
"url": "/v1/invoices/in_NORMALIZED00000000000001/lines"
},
"livemode": false,
"metadata": {},
"next_payment_attempt": 1000000000,
"number": "NORMALI-0001",
"object": "invoice",
"paid": false,
"payment_intent": "pi_1Gb9aBGh0CmXqmnw3RdjXFtK",
"period_end": 1000000000,
"period_start": 1000000000,
"post_payment_credit_notes_amount": 0,
"pre_payment_credit_notes_amount": 0,
"receipt_number": null,
"starting_balance": 0,
"statement_descriptor": "Zulip Standard",
"status": "open",
"status_transitions": {
"finalized_at": 1000000000,
"marked_uncollectible_at": null,
"paid_at": null,
"voided_at": null
},
"subscription": null,
"subtotal": 120000,
"tax": null,
"tax_percent": null,
"total": 120000,
"total_tax_amounts": [],
"webhooks_delivered_at": 1000000000
}
],
"has_more": false,
"object": "list",
"url": "/v1/invoices"
}

View File

@@ -1,295 +0,0 @@
{
"data": [
{
"account_country": "US",
"account_name": "Dev account",
"amount_due": 152000,
"amount_paid": 0,
"amount_remaining": 152000,
"application_fee_amount": null,
"attempt_count": 0,
"attempted": false,
"auto_advance": true,
"billing": "charge_automatically",
"billing_reason": "manual",
"charge": null,
"collection_method": "charge_automatically",
"created": 1010000004,
"currency": "usd",
"custom_fields": null,
"customer": "cus_NORMALIZED0001",
"customer_address": null,
"customer_email": "hamlet@zulip.com",
"customer_name": null,
"customer_phone": null,
"customer_shipping": null,
"customer_tax_exempt": "none",
"customer_tax_ids": [],
"default_payment_method": null,
"default_source": null,
"default_tax_rates": [],
"description": "",
"discount": null,
"due_date": 1000000000,
"ending_balance": 0,
"footer": null,
"hosted_invoice_url": "https://pay.stripe.com/invoice/acct_NORMALIZED000001/invst_NORMALIZED0000000000000003u5uCV",
"id": "in_NORMALIZED00000000000003",
"invoice_pdf": "https://pay.stripe.com/invoice/acct_NORMALIZED000001/invst_NORMALIZED0000000000000003u5uCV/pdf",
"lines": {
"data": [
{
"amount": 152000,
"currency": "usd",
"description": "Zulip Standard - renewal",
"discountable": false,
"id": "ii_NORMALIZED00000000000003",
"invoice_item": "ii_NORMALIZED00000000000003",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1393729445,
"start": 1362193445
},
"plan": null,
"proration": false,
"quantity": 19,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem",
"unique_id": "il_1Gb9aEGh0CmXqmnwbJpsbILw"
}
],
"has_more": false,
"object": "list",
"total_count": 1,
"url": "/v1/invoices/in_NORMALIZED00000000000003/lines"
},
"livemode": false,
"metadata": {},
"next_payment_attempt": 1000000000,
"number": "NORMALI-0003",
"object": "invoice",
"paid": false,
"payment_intent": "pi_1Gb9aGGh0CmXqmnwehKKVEuG",
"period_end": 1000000000,
"period_start": 1000000000,
"post_payment_credit_notes_amount": 0,
"pre_payment_credit_notes_amount": 0,
"receipt_number": null,
"starting_balance": 0,
"statement_descriptor": "Zulip Standard",
"status": "open",
"status_transitions": {
"finalized_at": 1000000000,
"marked_uncollectible_at": null,
"paid_at": null,
"voided_at": null
},
"subscription": null,
"subtotal": 152000,
"tax": null,
"tax_percent": null,
"total": 152000,
"total_tax_amounts": [],
"webhooks_delivered_at": 1000000000
},
{
"account_country": "US",
"account_name": "Dev account",
"amount_due": 5172,
"amount_paid": 0,
"amount_remaining": 5172,
"application_fee_amount": null,
"attempt_count": 0,
"attempted": false,
"auto_advance": true,
"billing": "charge_automatically",
"billing_reason": "manual",
"charge": null,
"collection_method": "charge_automatically",
"created": 1010000003,
"currency": "usd",
"custom_fields": null,
"customer": "cus_NORMALIZED0001",
"customer_address": null,
"customer_email": "hamlet@zulip.com",
"customer_name": null,
"customer_phone": null,
"customer_shipping": null,
"customer_tax_exempt": "none",
"customer_tax_ids": [],
"default_payment_method": null,
"default_source": null,
"default_tax_rates": [],
"description": "",
"discount": null,
"due_date": 1000000000,
"ending_balance": 0,
"footer": null,
"hosted_invoice_url": "https://pay.stripe.com/invoice/acct_NORMALIZED000001/invst_NORMALIZED00000000000000027eQ4i",
"id": "in_NORMALIZED00000000000002",
"invoice_pdf": "https://pay.stripe.com/invoice/acct_NORMALIZED000001/invst_NORMALIZED00000000000000027eQ4i/pdf",
"lines": {
"data": [
{
"amount": 5172,
"currency": "usd",
"description": "Additional license (Jan 2, 2013 - Mar 2, 2013)",
"discountable": false,
"id": "ii_NORMALIZED00000000000002",
"invoice_item": "ii_NORMALIZED00000000000002",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1362193445,
"start": 1357095845
},
"plan": null,
"proration": false,
"quantity": 4,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem",
"unique_id": "il_1Gb9aCGh0CmXqmnwp7mzzDq1"
}
],
"has_more": false,
"object": "list",
"total_count": 1,
"url": "/v1/invoices/in_NORMALIZED00000000000002/lines"
},
"livemode": false,
"metadata": {},
"next_payment_attempt": 1000000000,
"number": "NORMALI-0002",
"object": "invoice",
"paid": false,
"payment_intent": "pi_1Gb9aDGh0CmXqmnwTdKviFVy",
"period_end": 1000000000,
"period_start": 1000000000,
"post_payment_credit_notes_amount": 0,
"pre_payment_credit_notes_amount": 0,
"receipt_number": null,
"starting_balance": 0,
"statement_descriptor": "Zulip Standard",
"status": "open",
"status_transitions": {
"finalized_at": 1000000000,
"marked_uncollectible_at": null,
"paid_at": null,
"voided_at": null
},
"subscription": null,
"subtotal": 5172,
"tax": null,
"tax_percent": null,
"total": 5172,
"total_tax_amounts": [],
"webhooks_delivered_at": 1000000000
},
{
"account_country": "US",
"account_name": "Dev account",
"amount_due": 120000,
"amount_paid": 0,
"amount_remaining": 120000,
"application_fee_amount": null,
"attempt_count": 0,
"attempted": false,
"auto_advance": true,
"billing": "charge_automatically",
"billing_reason": "manual",
"charge": null,
"collection_method": "charge_automatically",
"created": 1010000002,
"currency": "usd",
"custom_fields": null,
"customer": "cus_NORMALIZED0001",
"customer_address": null,
"customer_email": "hamlet@zulip.com",
"customer_name": null,
"customer_phone": null,
"customer_shipping": null,
"customer_tax_exempt": "none",
"customer_tax_ids": [],
"default_payment_method": null,
"default_source": null,
"default_tax_rates": [],
"description": "",
"discount": null,
"due_date": 1000000000,
"ending_balance": 0,
"footer": null,
"hosted_invoice_url": "https://pay.stripe.com/invoice/acct_NORMALIZED000001/invst_NORMALIZED0000000000000001PVAbb",
"id": "in_NORMALIZED00000000000001",
"invoice_pdf": "https://pay.stripe.com/invoice/acct_NORMALIZED000001/invst_NORMALIZED0000000000000001PVAbb/pdf",
"lines": {
"data": [
{
"amount": 120000,
"currency": "usd",
"description": "Zulip Standard - renewal",
"discountable": false,
"id": "ii_NORMALIZED00000000000001",
"invoice_item": "ii_NORMALIZED00000000000001",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1362193445,
"start": 1330657445
},
"plan": null,
"proration": false,
"quantity": 15,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem",
"unique_id": "il_1Gb9aAGh0CmXqmnwYD2vuFL3"
}
],
"has_more": false,
"object": "list",
"total_count": 1,
"url": "/v1/invoices/in_NORMALIZED00000000000001/lines"
},
"livemode": false,
"metadata": {},
"next_payment_attempt": 1000000000,
"number": "NORMALI-0001",
"object": "invoice",
"paid": false,
"payment_intent": "pi_1Gb9aBGh0CmXqmnw3RdjXFtK",
"period_end": 1000000000,
"period_start": 1000000000,
"post_payment_credit_notes_amount": 0,
"pre_payment_credit_notes_amount": 0,
"receipt_number": null,
"starting_balance": 0,
"statement_descriptor": "Zulip Standard",
"status": "open",
"status_transitions": {
"finalized_at": 1000000000,
"marked_uncollectible_at": null,
"paid_at": null,
"voided_at": null
},
"subscription": null,
"subtotal": 120000,
"tax": null,
"tax_percent": null,
"total": 120000,
"total_tax_amounts": [],
"webhooks_delivered_at": 1000000000
}
],
"has_more": false,
"object": "list",
"url": "/v1/invoices"
}

View File

@@ -1,24 +0,0 @@
{
"amount": 120000,
"currency": "usd",
"customer": "cus_NORMALIZED0001",
"date": 1000000000,
"description": "Zulip Standard - renewal",
"discountable": false,
"id": "ii_NORMALIZED00000000000001",
"invoice": null,
"livemode": false,
"metadata": {},
"object": "invoiceitem",
"period": {
"end": 1362193445,
"start": 1330657445
},
"plan": null,
"proration": false,
"quantity": 15,
"subscription": null,
"tax_rates": [],
"unit_amount": 8000,
"unit_amount_decimal": "8000"
}

View File

@@ -1,24 +0,0 @@
{
"amount": 5172,
"currency": "usd",
"customer": "cus_NORMALIZED0001",
"date": 1000000000,
"description": "Additional license (Jan 2, 2013 - Mar 2, 2013)",
"discountable": false,
"id": "ii_NORMALIZED00000000000002",
"invoice": null,
"livemode": false,
"metadata": {},
"object": "invoiceitem",
"period": {
"end": 1362193445,
"start": 1357095845
},
"plan": null,
"proration": false,
"quantity": 4,
"subscription": null,
"tax_rates": [],
"unit_amount": 1293,
"unit_amount_decimal": "1293"
}

View File

@@ -1,24 +0,0 @@
{
"amount": 152000,
"currency": "usd",
"customer": "cus_NORMALIZED0001",
"date": 1000000000,
"description": "Zulip Standard - renewal",
"discountable": false,
"id": "ii_NORMALIZED00000000000003",
"invoice": null,
"livemode": false,
"metadata": {},
"object": "invoiceitem",
"period": {
"end": 1393729445,
"start": 1362193445
},
"plan": null,
"proration": false,
"quantity": 19,
"subscription": null,
"tax_rates": [],
"unit_amount": 8000,
"unit_amount_decimal": "8000"
}

View File

@@ -1,33 +0,0 @@
{
"card": {
"address_city": "Pacific",
"address_country": "United States",
"address_line1": "Under the sea,",
"address_line1_check": "unchecked",
"address_line2": null,
"address_state": null,
"address_zip": "33333",
"address_zip_check": "unchecked",
"brand": "Visa",
"country": "US",
"cvc_check": "unchecked",
"dynamic_last4": null,
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"id": "card_NORMALIZED00000000000001",
"last4": "4242",
"metadata": {},
"name": "Ada Starr",
"object": "card",
"tokenization_method": null
},
"client_ip": "0.0.0.0",
"created": 1010000001,
"id": "tok_NORMALIZED00000000000001",
"livemode": false,
"object": "token",
"type": "card",
"used": false
}

View File

@@ -1,54 +0,0 @@
{
"account_balance": 0,
"address": null,
"balance": 0,
"created": 1010000001,
"currency": null,
"default_source": null,
"delinquent": false,
"description": "zulip (Zulip Dev)",
"discount": null,
"email": "hamlet@zulip.com",
"id": "cus_NORMALIZED0001",
"invoice_prefix": "NORMA01",
"invoice_settings": {
"custom_fields": null,
"default_payment_method": null,
"footer": null
},
"livemode": false,
"metadata": {
"realm_id": "1",
"realm_str": "zulip"
},
"name": null,
"next_invoice_sequence": 1,
"object": "customer",
"phone": null,
"preferred_locales": [],
"shipping": null,
"sources": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/customers/cus_NORMALIZED0001/sources"
},
"subscriptions": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/customers/cus_NORMALIZED0001/subscriptions"
},
"tax_exempt": "none",
"tax_ids": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/customers/cus_NORMALIZED0001/tax_ids"
},
"tax_info": null,
"tax_info_verification": null
}

View File

@@ -1,54 +0,0 @@
{
"account_balance": 0,
"address": null,
"balance": 0,
"created": 1010000001,
"currency": null,
"default_source": null,
"delinquent": false,
"description": "zulip (Zulip Dev)",
"discount": null,
"email": "hamlet@zulip.com",
"id": "cus_NORMALIZED0001",
"invoice_prefix": "NORMA01",
"invoice_settings": {
"custom_fields": null,
"default_payment_method": null,
"footer": null
},
"livemode": false,
"metadata": {
"realm_id": "1",
"realm_str": "zulip"
},
"name": null,
"next_invoice_sequence": 1,
"object": "customer",
"phone": null,
"preferred_locales": [],
"shipping": null,
"sources": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/customers/cus_NORMALIZED0001/sources"
},
"subscriptions": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/customers/cus_NORMALIZED0001/subscriptions"
},
"tax_exempt": "none",
"tax_ids": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/customers/cus_NORMALIZED0001/tax_ids"
},
"tax_info": null,
"tax_info_verification": null
}

View File

@@ -1,54 +0,0 @@
{
"account_balance": 0,
"address": null,
"balance": 0,
"created": 1010000001,
"currency": null,
"default_source": null,
"delinquent": false,
"description": "zulip (Zulip Dev)",
"discount": null,
"email": "hamlet@zulip.com",
"id": "cus_NORMALIZED0001",
"invoice_prefix": "NORMA01",
"invoice_settings": {
"custom_fields": null,
"default_payment_method": null,
"footer": null
},
"livemode": false,
"metadata": {
"realm_id": "1",
"realm_str": "zulip"
},
"name": null,
"next_invoice_sequence": 1,
"object": "customer",
"phone": null,
"preferred_locales": [],
"shipping": null,
"sources": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/customers/cus_NORMALIZED0001/sources"
},
"subscriptions": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/customers/cus_NORMALIZED0001/subscriptions"
},
"tax_exempt": "none",
"tax_ids": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/customers/cus_NORMALIZED0001/tax_ids"
},
"tax_info": null,
"tax_info_verification": null
}

Some files were not shown because too many files have changed in this diff.